
Friday, December 24, 2010

[ORACLE] How to import a dump file into a different tablespace

1.With your .DMP file, create a SQL file containing the structure (Tables):
imp evolvrepo/password@ORCL file=evolvrepo.dmp indexfile=index.sql full=y

2.Open the indexfile (index.sql) in a text editor that can do find and replace over an entire file, and perform the following find-and-replace operations IN ORDER (the single quotes are only delimiters; do not type them):

a.Find: 'REM' Replace: (nothing)

b.Find: '""' Replace: '"#tablespace#"'

c.Find: '...' Replace: 'REM ...'

d.Find: 'CONNECT' Replace: 'REM CONNECT'


3.Save the indexfile, then run it at command prompt
echo exit|sqlplus evolvrepo/password@ORCL @index.sql

4.Finally run the same .DMP file you created the indexfile with against the same account to import the data, stored procedures, views etc:
imp evolvrepo/password@ORCL full=y file=evolvrepo.dmp buffer=326000 log=evolvrepo.log


Note: Replace #tablespace# with your tablespace name and #TEMP1# with your temporary tablespace name.

Tuesday, December 14, 2010

[T-SQL] Paging in SQL Server 2011

So here is a new feature in the next release of SQL Server (code-named Denali): server-side paging with OFFSET/FETCH, for fetching rows or a result set in pages.

USE AdventureWorks2008R2
GO
DECLARE @RowsPerPage INT = 10, @PageNumber INT = 5
SELECT *
FROM Sales.SalesOrderDetail
ORDER BY SalesOrderDetailID
OFFSET @PageNumber*@RowsPerPage ROWS
FETCH NEXT @RowsPerPage ROWS ONLY
GO
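If you are still on SQL Server 2005 or 2008, where OFFSET/FETCH is not available, a rough equivalent can be written with ROW_NUMBER(). This is only a sketch of the same page against the same AdventureWorks table; adjust the page variables to taste.

-- Paging with ROW_NUMBER() on versions prior to Denali (sketch)
DECLARE @RowsPerPage INT, @PageNumber INT
SET @RowsPerPage = 10
SET @PageNumber = 5
;WITH Paged AS
(
SELECT sod.*, ROW_NUMBER() OVER (ORDER BY SalesOrderDetailID) AS RowNum
FROM Sales.SalesOrderDetail sod
)
SELECT *
FROM Paged
WHERE RowNum > @PageNumber * @RowsPerPage
AND RowNum <= (@PageNumber + 1) * @RowsPerPage
ORDER BY RowNum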

Wednesday, December 8, 2010

[Informatica] Powermart Command Prompt pmcmd

Here is how you can execute a workflow from the command prompt:

./pmcmd
pmcmd> connect -sv TestingRepoIntegrationService1 -d Primatics_corvus.primaticsfinancial.com -u uname -p password
pmcmd> setfolder "Demo Mapping"
pmcmd> startworkflow Main_Model_Bank

---- Description
sv: Integration Service Name
d: Domain Name
u: username
p: password
setfolder: specifies the folder where the workflow is placed
startworkflow: specifies the workflow to start

[T-SQL] Find Space Used For Any Particular Table

Use Temp;
Go
EXEC sp_spaceused 'Temp1';
GO

You will get the following details:

1. Name
2. Rows
3. Reserved
4. Data
5. Index Size
6. Unused

Monday, December 6, 2010

[Informatica] Tutorials

Here are some initial tutorials on how to work with Informatica.

How to create a mapping, session and workflow using Informatica PowerCenter:

http://www.youtube.com/watch?v=zkJoet5jBtE

Workflow Properties:

http://www.youtube.com/watch?v=2FiFLaNkwnY

Let me know if you have any queries regarding any steps that are defined in between.

[Informatica] New Label

I have started working on Informatica, so I will also start posting some good material related to Informatica under the label [Informatica].

I am working on the following tools:

Informatica 9.0.1
=> Power Center Designer
=> Power Center Workflow Manager
=> Power Center Workflow Monitor

For any queries , do let me know.

Sunday, November 28, 2010

[T-SQL] What to do if an MDF file gets corrupted

Today one of my old friends asked for help: his MDF file had gotten corrupted, so what should he do?

Normally there are different exercises one can do if one knows about the internal structure of SQL Server, but I am suggesting two tools which you can download to get the task done:

http://www.mdfrepair.com/
http://www.nucleustechnologies.com/sql-recovery.html

Download one of them and follow the simple steps to get your database repaired.

Different versions of SQL Server 2008

Choosing the right SQL Server 2008 Edition is something that requires a bit of research. The first step is of course to become familiar with the available options. We have seen this question pop up a lot of times at client sites as well as in newsgroups and the e-mails that we get from the readers. Microsoft already has very good, concise and to the point information on the different editions, their capabilities as well as a comparison matrix between Enterprise & Standard Edition as well as a comparison matrix between Compact & Express Edition. There are a lot of choices available for you and depending upon your needs, you can choose from (all the details on these can be obtained from the link below):

•Enterprise Edition
•Standard Edition
•Workgroup Edition
•Web Edition
•Compact Edition
•Express Edition
◦SQL Server 2008 Express Edition.
◦SQL Server 2008 Express with Tools.
◦SQL Server 2008 Express with Advanced Services.

Full comparison sheet link is given below:

http://download.microsoft.com/download/2/d/f/2df66c0c-fff2-4f2e-b739-bf4581cee533/SQLServer%202008CompareEnterpriseStandard.pdf

Friday, November 26, 2010

Comparison between prices of SQL Server and Oracle

SQL Server is a feature-rich and economical choice compared to Oracle. Oracle's base product is expensive, and adding all the features that SQL Server offers requires many different add-ons. These extra add-ons further increase Oracle's price, making SQL Server much more affordable than Oracle.

I suggest that instead of me taking the side of SQL Server, you should compare the prices of the products yourself. The prices listed here are per processor.

SQL Server Standard: $7,171
SQL Server Enterprise: $27,495

Oracle Standard One: $5,800
Oracle Standard: $17,500
Oracle Enterprise: $47,500

Additionally, in the case of Oracle, you will have to purchase many additional features which are already available at NO COST in SQL Server Enterprise Edition. For example, in the SQL Server Enterprise edition, partitioning is part of the package, whereas in the case of Oracle you will have to pay an additional $11,500 per processor on top of the Enterprise edition purchase. If you have a four-processor server, the partitioning option alone ends up costing almost as much as an Oracle Enterprise license (4 x $11,500 = $46,000, close to the $47,500 list price). Again, this is just one feature we talked about. There are many additional add-ons available for Oracle which are free in the SQL Server Enterprise or Standard edition (e.g. BI tools).

*Prices listed are taken from the official PDFs that were available on both companies' websites in July 2010.

Sunday, November 21, 2010

[T-SQL] How to create new error log file

There are two methods to do this:

1. Restart the server
2. EXEC sp_cycle_errorlog

The first option is not a suitable one, as we cannot restart the server every time.

Hope this post is helpful to many DBAs.

Saturday, November 13, 2010

[T-SQL] Drop Database - Common Error

Normally when you try to drop a database, you get the common error that it is currently in use. So here is the proper script for how you should drop a database. But please make sure you know what you are doing, as dropping a database on Production can be a big issue.

USE MASTER
GO
ALTER DATABASE Mabbas_DB
SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
DROP DATABASE Mabbas_DB
GO

Wednesday, November 10, 2010

[T-SQL] Last Backup time for all DBs

One more script to share

SELECT sdb.Name AS DatabaseName,
COALESCE(CONVERT(VARCHAR(12), MAX(bus.backup_finish_date), 101),'-') AS LastBackUpTime
FROM sys.sysdatabases sdb
LEFT OUTER JOIN msdb.dbo.backupset bus ON bus.database_name = sdb.name
GROUP BY sdb.Name

[T-SQL] DB Backup History

Today I was visiting a blog (Pinal Dave's) and found a very interesting query for getting database backup history.

SELECT TOP 100
s.database_name,
m.physical_device_name,
CAST(CAST(s.backup_size / 1000000 AS INT) AS VARCHAR(14)) + ' ' + 'MB' AS bkSize,
CAST(DATEDIFF(second, s.backup_start_date,
s.backup_finish_date) AS VARCHAR(4)) + ' ' + 'Seconds' TimeTaken,
s.backup_start_date,
CAST(s.first_lsn AS VARCHAR(50)) AS first_lsn,
CAST(s.last_lsn AS VARCHAR(50)) AS last_lsn,
CASE s.[type]
WHEN 'D' THEN 'Full'
WHEN 'I' THEN 'Differential'
WHEN 'L' THEN 'Transaction Log'
END AS BackupType,
s.server_name,
s.recovery_model
FROM msdb.dbo.backupset s
INNER JOIN msdb.dbo.backupmediafamily m ON s.media_set_id = m.media_set_id
WHERE s.database_name = DB_NAME() -- Remove this line to report on all databases
ORDER BY backup_start_date DESC, backup_finish_date
GO

Monday, November 8, 2010

[T-SQL] Delete duplicate without ID

Here is a query by which you can delete duplicate rows without using an identity column:

/* Delete Duplicate records */
WITH CTE (COl1,Col2, DuplicateCount)
AS
(
SELECT COl1,Col2,
ROW_NUMBER() OVER(PARTITION BY COl1,Col2 ORDER BY Col1) AS DuplicateCount
FROM DuplicateRecordTable
)
DELETE
FROM CTE
WHERE DuplicateCount > 1
GO

Friday, November 5, 2010

[T-SQL] Union vs. Union All

Nowadays I am interviewing candidates for the post of DWH Engineer, and a common question I ask is the one above, but I don't know why people don't know this.

So here is the basic definition:

UNION
The UNION command is used to combine the results of two SELECT statements into a single result set. When using UNION, all corresponding columns need to be of compatible data types. With UNION, only distinct rows are selected.

UNION ALL
The UNION ALL command is the same as the UNION command, except that UNION ALL selects all rows, including duplicates.

The difference between UNION and UNION ALL is that UNION ALL will not eliminate duplicate rows; instead, it just pulls all rows from all tables fitting your query specifics and combines them into a single result set.

If we talk about performance, it should be known that UNION ALL is much faster than UNION, as UNION involves a DISTINCT step, which is itself a heavy operation.
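A minimal example to see the difference; the inline VALUES rows are made up for illustration and need SQL Server 2008 or later:

-- UNION returns distinct rows only
SELECT Val FROM (VALUES (1), (2), (3)) AS A(Val)
UNION
SELECT Val FROM (VALUES (2), (3), (4)) AS B(Val)
-- 4 rows: 1, 2, 3, 4

-- UNION ALL returns every row, duplicates included
SELECT Val FROM (VALUES (1), (2), (3)) AS A(Val)
UNION ALL
SELECT Val FROM (VALUES (2), (3), (4)) AS B(Val)
-- 6 rows: 1, 2, 3, 2, 3, 4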

Thursday, November 4, 2010

[T-SQL] Find Busiest Database(s)

The DMV sys.dm_exec_query_stats contains the columns total_logical_reads, total_logical_writes and sql_handle. The sql_handle column can help to determine the original query by CROSS APPLYing the DMF sys.dm_exec_sql_text. From sys.dm_exec_sql_text we get the database ID, and from the database ID the database name can be figured out very quickly.

SELECT SUM(deqs.total_logical_reads) TotalPageReads,
SUM(deqs.total_logical_writes) TotalPageWrites,
CASE
WHEN DB_NAME(dest.dbid) IS NULL THEN 'AdhocSQL'
ELSE DB_NAME(dest.dbid) END Databasename
FROM sys.dm_exec_query_stats deqs
CROSS APPLY sys.dm_exec_sql_text(deqs.sql_handle) AS dest
GROUP BY DB_NAME(dest.dbid)

Monday, November 1, 2010

[T-SQL] Taking backup at two places same time

If you want to take a backup of any DB to two places at the same time, then the following T-SQL needs to be executed.

BACKUP DATABASE [AdventureWorks] TO
DISK = N'D:\AdventureWorks.bak'
MIRROR TO DISK = N'E:\AdventureWorks.bak'
WITH FORMAT, INIT,
NAME = N'AdventureWorks-Full Database Backup'
GO

Don't use the T-SQL below, as it would break your DB backup into two files and it would not serve the purpose: you would need both of these files when you are restoring the database.

BACKUP DATABASE [AdventureWorks] TO
DISK = N'D:\AdventureWorks.bak'
, DISK = N'E:\AdventureWorks.bak'
WITH FORMAT, INIT,
NAME = N'AdventureWorks-Full Database Backup'
GO

Wednesday, October 27, 2010

[T-SQL] Difference between @@IDENTITY vs SCOPE_IDENTITY() vs IDENT_CURRENT

SELECT @@IDENTITY
It returns the last IDENTITY value produced on a connection, regardless of the table that produced the value, and regardless of the scope of the statement that produced the value.
@@IDENTITY will return the last identity value entered into a table in your current session. While @@IDENTITY is limited to the current session, it is not limited to the current scope. If you have a trigger on a table that causes an identity to be created in another table, you will get the identity that was created last, even if it was the trigger that created it.

SELECT SCOPE_IDENTITY()
It returns the last IDENTITY value produced on a connection and by a statement in the same scope, regardless of the table that produced the value.
SCOPE_IDENTITY(), like @@IDENTITY, will return the last identity value created in the current session, but it will also limit it to your current scope as well. In other words, it will return the last identity value that you explicitly created, rather than any identity that was created by a trigger or a user defined function.

SELECT IDENT_CURRENT('tablename')
It returns the last IDENTITY value produced in a table, regardless of the connection that created the value, and regardless of the scope of the statement that produced the value.
IDENT_CURRENT is not limited by scope and session; it is limited to a specified table. IDENT_CURRENT returns the identity value generated for a specific table in any session and any scope.
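A small sketch to see the three functions disagree. The tables (ParentTbl, AuditTbl) and the trigger are made up purely for illustration; the trigger inserts into a second identity table, which is exactly the case where @@IDENTITY and SCOPE_IDENTITY() differ.

CREATE TABLE ParentTbl (ID INT IDENTITY(1,1), Val VARCHAR(10))
CREATE TABLE AuditTbl (AuditID INT IDENTITY(100,1), Note VARCHAR(10))
GO
-- The trigger creates an identity value in another table
CREATE TRIGGER trParent ON ParentTbl AFTER INSERT
AS INSERT INTO AuditTbl (Note) VALUES ('logged')
GO
INSERT INTO ParentTbl (Val) VALUES ('abc')
SELECT @@IDENTITY AS LastIdentity_AnyScope,           -- 100, came from AuditTbl via the trigger
SCOPE_IDENTITY() AS LastIdentity_ThisScope,           -- 1, came from ParentTbl
IDENT_CURRENT('AuditTbl') AS LastIdentity_ForTable    -- 100, regardless of session or scope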

Monday, October 25, 2010

[T-SQL] Difference between Index Scan and Index Seek

Index Scan: Since a scan touches every row in the table, whether or not it qualifies, the cost is proportional to the total number of rows in the table. Thus, a scan is an efficient strategy if the table is small or if most of the rows qualify for the predicate.

Index Seek: Since a seek only touches rows that qualify and pages that contain these qualifying rows, the cost is proportional to the number of qualifying rows and pages rather than to the total number of rows in the table.
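A quick way to see both operators, assuming the AdventureWorks sample database with its default indexes: run these two queries with the actual execution plan enabled and compare the operators.

-- Index Seek: the predicate matches the clustered index key, so only qualifying rows are touched
SELECT SalesOrderID, SalesOrderDetailID
FROM Sales.SalesOrderDetail
WHERE SalesOrderDetailID = 1000

-- Index Scan: there is no index on UnitPrice, so every row has to be read
SELECT SalesOrderID, SalesOrderDetailID
FROM Sales.SalesOrderDetail
WHERE UnitPrice > 1000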

Thursday, October 21, 2010

[T-SQL] How to Rename a Column Name or Table Name

This morning I needed to rename a column and the table itself, so I thought to share it on the blog.

The script for renaming any column :
sp_RENAME 'TableName.[OldColumnName]' , 'NewColumnName', 'COLUMN'

The script for renaming any object (table, sp etc) :
sp_RENAME 'OldTableName' , 'NewTableName'

This example renames the customers table to custs.

EXEC sp_rename 'customers', 'custs'

This example renames the contact title column in the customers table to title.

EXEC sp_rename 'customers.[contact title]', 'title', 'COLUMN'

Wednesday, October 20, 2010

[T-SQL] Add a new NOT NULL column to a table that already has data

This is a very simple question that is asked many times, and the answer is much simpler than the question.

Here is one example where we have to add a date column (CALCULATEDDATE) to the table COMMISSIONEARNED.

So here are the steps:

1. Add column without specifying not null
ALTER TABLE COMMISSIONEARNED ADD [CALCULATEDDATE] [datetime]
2. Add a default constraint if it's your business requirement
ALTER TABLE COMMISSIONEARNED ADD CONSTRAINT [DF_CE_CALCULATEDDATE] DEFAULT (getdate()) FOR [CALCULATEDDATE]
3. Update field with the data according to your business logic
UPDATE COMMISSIONEARNED SET CALCULATEDDATE=GETDATE()
4. Alter the column so that it no longer accepts NULLs
ALTER TABLE COMMISSIONEARNED ALTER COLUMN [CALCULATEDDATE] [datetime] NOT NULL

Let me know for any questions

Monday, October 18, 2010

[T-SQL] My favourite SQL shortcuts

Today I will tell you my favourite SQL shortcuts and the way to set them up.

In SSMS, go to Tools and then Options.

In the Options dialog, under the Keyboard settings (query shortcuts), you can set up the SQL statements you run often.

You can execute them very simply: just highlight the table name and press whatever key you have configured.

Wednesday, October 13, 2010

[T-SQL] Stored Procedure Advantage

One of the SQL experts was asked this question, and which single advantage is the best one; here is the answer:

There are many advantages of stored procedures. I was once asked what I think is the most important feature of stored procedures, and I had to pick only ONE. It is a tough question.
I answered: Execution Plan Retention and Reuse (SPs are compiled and their execution plan is cached and reused when the same SP is executed again).

Not to mention, I received a second question following my answer: Why? Because all the other known advantages of SPs (they are mentioned below) can be achieved without using SPs, whereas Execution Plan Retention and Reuse can only be achieved using a stored procedure (a small query to observe plan reuse follows the list below).

Execution plan retention and reuse
Query auto-parameterization
Encapsulation of business rules and policies
Application modularization
Sharing of application logic between applications
Access to database objects that is both secure and uniform
Consistent, safe data modification
Network bandwidth conservation
Support for automatic execution at system start-up
Enhanced hardware and software capabilities
Improved security
Reduced development cost and increased reliability
Centralized security, administration, and maintenance for common routines
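If you want to see the plan retention and reuse for yourself, one way (a sketch, not the only one) is to execute a stored procedure a few times and then check how often its cached plan has been used:

-- Cached plans for stored procedures and how many times each plan was reused
SELECT cp.objtype, cp.usecounts, cp.size_in_bytes, st.text
FROM sys.dm_exec_cached_plans cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) st
WHERE cp.objtype = 'Proc'
ORDER BY cp.usecounts DESC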

[T-SQL] Move TempDB from one drive to another drive if TempDB is full.

If you come across the following errors in the log file, please follow the directions below.

Source: MSSQLSERVER
Event ID: 17052
Description: The log file for database 'tempdb' is full.
Back up the transaction log for the database to free up some log space.

Make sure that TempDB is set to autogrow and do not set a maximum size for TempDB. If the current drive is too full to allow autogrow events, then arrange a bigger drive, or add files to TempDB on another device (using ALTER DATABASE as described below) and allow those files to autogrow.

Move TempDB from one drive to another. There are two major reasons why TempDB may need to be moved from one drive to another.
1) TempDB grows big and the existing drive does not have enough space.
2) Moving TempDB files to a different physical drive helps improve database disk reads, as the drives can be read simultaneously.

Follow direction below exactly to move database and log from one drive (c:) to another drive (d:) and (e:).

Open Query Analyzer and connect to your server. Run this script to get the names of the files used for TempDB.
USE TempDB
GO
EXEC sp_helpfile
GO

Results will be something like:

name     fileid  filename                                                      filegroup  size
tempdev  1       C:\Program Files\Microsoft SQL Server\MSSQL\data\tempdb.mdf   PRIMARY    16000 KB
templog  2       C:\Program Files\Microsoft SQL Server\MSSQL\data\templog.ldf  NULL       1024 KB

along with other information related to the database. The names of the files are usually tempdev and templog by default. These names will be used in the next statement. Run the following code to move the mdf and ldf files.
USE master
GO
ALTER DATABASE TempDB MODIFY FILE
(NAME = tempdev, FILENAME = 'd:\data\tempdb.mdf')
GO
ALTER DATABASE TempDB MODIFY FILE
(NAME = templog, FILENAME = 'e:\data\templog.ldf')
GO

The definition of TempDB is changed. However, no changes are made to TempDB until SQL Server restarts. Please stop and restart SQL Server and it will create the TempDB files in the new locations.

[T-SQL] How to remove duplicate data

One of the very common questions that is asked many times: what is the easiest or fastest way to delete duplicate data?

This method has one limitation: the table should have a primary key (the ID column in the query below).

DELETE
FROM TableName
WHERE ID NOT IN
(
SELECT MAX(ID)
FROM TableName
GROUP BY Col1, Col2, Col3)

*TableName should be replaced with your table name
*Col1, Col2 and Col3 must be replaced with the columns that determine which records qualify as duplicates. The list can be increased or decreased.

[T-SQL] How to generate CSV from Table Data

Here is a very simple code:

DECLARE @Str VARCHAR(MAX)
SELECT @Str = COALESCE(@Str+',' , '') + FieldName
FROM TableName
SELECT @Str

*FieldName should be replaced with your column name
*TableName should be replaced with your table name

[T-SQL] Query Analyzer Shortcuts

Shortcut: Function | Shortcut: Function
ALT+BREAK: Cancel a query | CTRL+SHIFT+F2: Clear all bookmarks
ALT+F1: Database object information | CTRL+SHIFT+INSERT: Insert a template
ALT+F4: Exit | CTRL+SHIFT+L: Make selection lowercase
CTRL+A: Select all | CTRL+SHIFT+M: Replace template parameters
CTRL+B: Move the splitter | CTRL+SHIFT+P: Open
CTRL+C: Copy | CTRL+SHIFT+R: Remove comment
CTRL+D: Display results in grid format | CTRL+SHIFT+S: Show client statistics
CTRL+Delete: Delete through the end of the line | CTRL+SHIFT+T: Show server trace
CTRL+E: Execute query | CTRL+SHIFT+U: Make selection uppercase
CTRL+F: Find | CTRL+T: Display results in text format
CTRL+F2: Insert/remove bookmark | CTRL+U: Change database
CTRL+F4: Disconnect | CTRL+V: Paste
CTRL+F5: Parse query and check syntax | CTRL+W: Window selector
CTRL+G: Go to line | CTRL+X: Delete
CTRL+H: Replace | CTRL+Z: Undo
CTRL+I: Index Tuning Wizard | F1: Help for Query Analyzer
CTRL+K: Display/hide execution plan | F2: Move to next bookmark
CTRL+L: Display execution plan | F3: Repeat last search
CTRL+N: New query window | F4: Object Search
CTRL+O: Connect | F5: Execute a query
CTRL+P: Print | F6: Switch between query and result panes
CTRL+R: Show/Hide results pane | F8: Show/hide Object Browser
CTRL+S: Save | SHIFT+F1: Transact-SQL help
CTRL+SHIFT+0: Show options | SHIFT+F2: Move to previous bookmark
CTRL+SHIFT+C: Comment out code | SHIFT+TAB: Decrease indent
CTRL+SHIFT+DEL: Clear the active Editor pane | SHIFT+F6: Switch panes
CTRL+SHIFT+F: Save results to file | TAB: Increase indent

Monday, October 11, 2010

[T-SQL] What is – DML, DDL, DCL and TCL – Introduction and Examples

DML

DML is an abbreviation of Data Manipulation Language. It is used to retrieve, store, modify, delete, insert and update data in a database.

Examples: SELECT, UPDATE, INSERT statements

DDL

DDL is an abbreviation of Data Definition Language. It is used to create and modify the structure of database objects in a database.

Examples: CREATE, ALTER, DROP statements

DCL

DCL is an abbreviation of Data Control Language. It is used to create roles, permissions, and referential integrity, as well as to control access to the database by securing it.

Examples: GRANT, REVOKE statements

TCL

TCL is abbreviation of Transactional Control Language. It is used to manage different transactions occurring within a database.

Examples: COMMIT, ROLLBACK statements
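A tiny illustration with one statement from each category; the table name and the user name (SomeUser) are made up:

-- DDL: define the structure
CREATE TABLE dbo.DemoTbl (ID INT PRIMARY KEY, Name VARCHAR(50))
-- DML: work with the data
INSERT INTO dbo.DemoTbl (ID, Name) VALUES (1, 'First row')
-- DCL: control access
GRANT SELECT ON dbo.DemoTbl TO SomeUser
-- TCL: control the transaction
BEGIN TRANSACTION
UPDATE dbo.DemoTbl SET Name = 'Updated' WHERE ID = 1
COMMIT TRANSACTION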

[My-Exp] Job Description for DBA/DB Developer

This JD is copied from one of the SQL Server Experts:

Job Description of Database Administrator (DBA) or Database Developer

Develop standards and guidelines to guide the use and acquisition of software and to protect vulnerable information.

Modify existing databases and database management systems or direct programmers and analysts to make changes.

Test programs or databases, correct errors and make necessary modifications.

Plan, coordinate and implement security measures to safeguard information in computer files against accidental or unauthorized damage, modification or disclosure.

Approve, schedule, plan, and supervise the installation and testing of new products and improvements to computer systems, such as the installation of new databases.

Train users and answer questions.

Establish and calculate optimum values for database parameters, using manuals and calculator.

Specify users and user access levels for each segment of database.

Develop data model describing data elements and how they are used, following procedures and using pen, template or computer software.

Develop methods for integrating different products so they work properly together, such as customizing commercial databases to fit specific needs.

Review project requests describing database user needs to estimate time and cost required to accomplish project.

Review procedures in database management system manuals for making changes to database.

Work as part of a project team to coordinate database development and determine project scope and limitations.

Select and enter codes to monitor database performance and to create production database.

Identify and evaluate industry trends in database systems to serve as a source of information and advice for upper management.

Write and code logical and physical database descriptions and specify identifiers of database to management system or direct others in coding descriptions.

Review workflow charts developed by programmer analyst to understand tasks computer will perform, such as updating records.

Revise company definition of data as defined in data dictionary.

[T-SQL] Create an Encrypted Stored Procedure

An answer to a very simple question: how to create an encrypted stored procedure, and is there any problem if we create one?

Here is the script

USE AdventureWorks
GO
-- Create encrypted SP
CREATE PROCEDURE uspEnc
WITH ENCRYPTION
AS
SELECT *
FROM Sales.SalesOrderDetail
UNION
SELECT *
FROM Sales.SalesOrderDetail
GO
-- Execute SP
EXEC uspEnc
GO

Question: If a stored procedure is encrypted, can we see its definition in Activity Monitor?
Answer: No.

Question: Is there any tool to decrypt the SP?
Answer: SQL Decryptor, Decrypt SQL

Question: Should a DBA recommend it?
Answer: Encrypted stored procedures are horrible. You can't see what queries are executed, and you don't have a clue what the execution plan is. So if users are complaining about the slowness of the system and everything is encrypted, then it will not be possible to look into it.

[T-SQL] Get Running Query from the Sessions

If you want to get the query running in one of the sessions, you can get it from the query below. And just to answer in advance: if you want to know which sessions are running, you can use sp_who and sp_who2.

Before executing this statement, please change the output option from 'Results to Grid' to 'Results to Text'. For this you can use the shortcut CTRL+T.

Here is the script; 55 should be replaced with your desired session ID.

DECLARE @sqltext VARBINARY(128)
SELECT @sqltext = sql_handle
FROM sys.sysprocesses
WHERE spid = 55
SELECT TEXT
FROM sys.dm_exec_sql_text(@sqltext)
GO

[T-SQL] Get Numbers from Alpha Numeric Value

Here is a procedure I just saw on one of the SQL expert blogs that will return the numbers from an alphanumeric value.

CREATE PROCEDURE [dbo].[GetNumbers]
@alpha VARCHAR(50),
@decimal DECIMAL(14, 5) OUTPUT
AS BEGIN
SET NOCOUNT ON;
DECLARE @ErrorMsg VARCHAR(50)
DECLARE @Pos INT
DECLARE @CommaPos INT
DECLARE @ZeroExists INT
DECLARE @alphaReverse VARCHAR(50)
DECLARE @NumPos INT
DECLARE @Len INT
-- 1 Reverse the alpha in order to get the last position of a numeric value
SET @alphaReverse = REVERSE(@alpha)
-- 2 Get the last position of a numeric figure
SET @NumPos = PATINDEX('%[0-9]%', @alphaReverse)
-- 3 Get the length of the string
SET @Len = LEN(@alpha)
-- 4 Add a comma after the numeric data in case it's no decimal number
SET @alpha = SUBSTRING(@alpha, 1, (@Len - @NumPos + 1))
+ ','
+ SUBSTRING(@alpha, (@Len - @NumPos + 2), 50)
-- Check if there is a zero (0) in the @alpha, then we later set the @decimal to 0
-- if it's 0 after the handling, else we set @decimal to NULL
-- If 0 no match, else there is a match
SET @ZeroExists = CHARINDEX ( '0' , @alpha ,1 )
-- Find position of , (comma)
SET @CommaPos = 1
SET @CommaPos = PATINDEX('%,%', @alpha)
IF (@CommaPos = '') BEGIN
SET @CommaPos = 20
END
SET @Pos = PATINDEX('%[^0-9]%',@alpha)
-- Replaces any alpha with '0' since we otherwise can't keep track of where the decimal
-- should be put in. We assume the numeric part has no alpha inside. The regular way
-- to solve this is to replace with '', but then we lose the way to find the place to
-- put in the decimal.
WHILE (@Pos > 0) BEGIN
SET @alpha = STUFF(@alpha, @pos, 1, '0')
SET @Pos = PATINDEX('%[^0-9]%',@alpha)
END
IF (@alpha IS NOT NULL AND @alpha != '') BEGIN
SET @decimal = CONVERT(DECIMAL(14, 5), SUBSTRING(@alpha, 1, (@CommaPos - 1))
+ '.'
+ SUBSTRING(@alpha, (@CommaPos + 1), 20))
END
-- Since we in this case don't want to set 0 if where is no numeric value, we set NULL to be safe
IF (@decimal = 0 AND @ZeroExists = 0) BEGIN
SET @decimal = NULL
END
END
GO

If you run the above SP as shown below, it will work:
DECLARE @myVal DECIMAL(14,5)
EXEC [GetNumbers] 'ABC455,88ghf', @myVal OUTPUT
SELECT @myVal RValue

RValue
---------------------------------------
455.88000

Friday, October 8, 2010

[My-Exp] Query Optimized

A few days back I blogged that my manager asked me to optimize an SP, so here is the story:

The SP could not process even one day of data in 45 minutes, but now it runs in 4 minutes for one month of data. Here are the tips of what I did (a sketch of the index and procedure options follows the list).

1. Added "with recompile" in sp.
2. Added "SET NOCOUNT ON;  "
3. Applied Parameter sniffing
4. Created two non-clustered indexes on one of the tables.
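Tips 1, 2 and 4 roughly look like the sketch below. The table, column and index names here are hypothetical; the real ones depend on the joins and WHERE clauses inside the SP.

-- Tip 4: nonclustered indexes on the columns the SP filters and joins on (hypothetical names)
CREATE NONCLUSTERED INDEX IX_FactTable_LoadDate ON dbo.FactTable (LoadDate)
CREATE NONCLUSTERED INDEX IX_FactTable_BankID ON dbo.FactTable (BankID) INCLUDE (Amount)
GO
-- Tips 1 and 2: recompile on every execution and suppress row-count messages
ALTER PROCEDURE dbo.usp_MonthlyModelRun
@StartDate DATETIME, @EndDate DATETIME
WITH RECOMPILE
AS
BEGIN
SET NOCOUNT ON;
-- ... body of the procedure ...
END
GO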

Thursday, October 7, 2010

[T-SQL] Clear Cache and Buffer of Stored Procedure

Use DBCC FREEPROCCACHE to clear the procedure cache. Freeing the procedure cache would cause, for example, an ad-hoc SQL statement to be recompiled rather than reused from the cache. If observing through SQL Profiler, one can watch the Cache Remove events occur as DBCC FREEPROCCACHE goes to work. DBCC FREEPROCCACHE will invalidate all stored procedure plans that the optimizer has cached in memory and force SQL Server to compile new plans the next time those procedures are run.

Use DBCC DROPCLEANBUFFERS to test queries with a cold buffer cache without shutting down and restarting the server. DBCC DROPCLEANBUFFERS serves to empty the data cache. Any data loaded into the buffer cache due to the prior execution of a query is removed.

DBCC FREEPROCCACHE
DBCC DROPCLEANBUFFERS

Wednesday, October 6, 2010

[T-SQL] Index rebuild or reorganize, which one to use?

Index Rebuild : This process drops the existing Index and Recreates the index.
USE AdventureWorks;
GO
ALTER INDEX ALL ON Production.Product REBUILD
GO

Index Reorganize : This process physically reorganizes the leaf nodes of the index.
USE AdventureWorks;
GO
ALTER INDEX ALL ON Production.Product REORGANIZE
GO

Recommendation: An index should be rebuilt when index fragmentation is greater than 40%. An index should be reorganized when index fragmentation is between 10% and 40%. The index rebuild process uses more CPU and locks database resources. SQL Server Developer and Enterprise editions have the ONLINE option, which can be turned on when an index is rebuilt; the ONLINE option keeps the index available during the rebuild.
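As a sketch, the thresholds above translate into something like this (reusing the AdventureWorks example from the post; ONLINE = ON is available only in Enterprise/Developer editions):

-- Fragmentation between 10% and 40%: reorganize
ALTER INDEX ALL ON Production.Product REORGANIZE
GO
-- Fragmentation above 40%: rebuild, keeping the index available where the edition allows it
ALTER INDEX ALL ON Production.Product REBUILD WITH (ONLINE = ON)
GO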

[T-SQL] How to find index fragmentation

This script will tell you the index fragmentation stats:

SELECT ps.database_id, ps.OBJECT_ID,
ps.index_id, b.name,
ps.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats (DB_ID(), NULL, NULL, NULL, NULL) AS ps
INNER JOIN sys.indexes AS b ON ps.OBJECT_ID = b.OBJECT_ID
AND ps.index_id = b.index_id
WHERE ps.database_id = DB_ID()
ORDER BY ps.OBJECT_ID

[T-SQL] Identify Blocking Query / Session

One of the most common problems a DBA faces is that one or many sessions get blocked by other sessions due to long-running statements or deadlocks. I have faced this problem many times and have read about the same problem being faced by many DBAs every day.

Here is the command by which you can kill the blocking session:
KILL 3 -- 3 is the blocking session ID

But now your question would be how to identify the blocking session , so here we go...

SELECT
db.name DBName,
tl.request_session_id,
wt.blocking_session_id,
OBJECT_NAME(p.OBJECT_ID) BlockedObjectName,
tl.resource_type,
h1.TEXT AS RequestingText,
h2.TEXT AS BlockingText,
tl.request_mode
FROM sys.dm_tran_locks AS tl
INNER JOIN sys.databases db ON db.database_id = tl.resource_database_id
INNER JOIN sys.dm_os_waiting_tasks AS wt ON tl.lock_owner_address = wt.resource_address
INNER JOIN sys.partitions AS p ON p.hobt_id = tl.resource_associated_entity_id
INNER JOIN sys.dm_exec_connections ec1 ON ec1.session_id = tl.request_session_id
INNER JOIN sys.dm_exec_connections ec2 ON ec2.session_id = wt.blocking_session_id
CROSS APPLY sys.dm_exec_sql_text(ec1.most_recent_sql_handle) AS h1
CROSS APPLY sys.dm_exec_sql_text(ec2.most_recent_sql_handle) AS h2
GO

Tuesday, October 5, 2010

[T-SQL] Optimize Stored Procedure

When I came in this morning, I had an email from my boss telling me there was a high-priority issue that needed to be solved immediately: optimize an SP that was creating a huge problem. I am currently working on it, but here are some tips to get the job done:

1. Include SET NOCOUNT ON statement:

With every SELECT and DML statement, SQL Server returns a message that indicates the number of rows affected by that statement. This information is mostly helpful in debugging the code, but it is useless after that. By setting SET NOCOUNT ON, we can disable the feature of returning this extra information. For stored procedures that contain several statements or contain Transact-SQL loops, setting SET NOCOUNT to ON can provide a significant performance boost because network traffic is greatly reduced.
CREATE PROC dbo.ProcName
AS
SET NOCOUNT ON;
--Procedure code here
SELECT column1 FROM dbo.TblTable1
-- Reset SET NOCOUNT to OFF
SET NOCOUNT OFF;
GO

2. Use schema name with object name:

The object name is qualified when used with the schema name. The schema name should be used with the stored procedure name and with all objects referenced inside the stored procedure. This helps in directly finding the compiled plan instead of searching the objects in other possible schemas before finally deciding to use a cached plan, if available. This process of searching and deciding a schema for an object takes a COMPILE lock on the stored procedure and decreases its performance. Therefore, always refer to objects with a qualified name in the stored procedure, like:
SELECT * FROM dbo.MyTable -- Preferred method
-- Instead of
SELECT * FROM MyTable -- Avoid this method
--And finally call the stored procedure with qualified name like:
EXEC dbo.MyProc -- Preferred method
--Instead of
EXEC MyProc -- Avoid this method

3. Do not use the prefix “sp_” in the stored procedure name:

If a stored procedure name begins with "SP_", then SQL Server first searches in the master database and then in the current session database. Searching in the master database causes extra overhead and can even give a wrong result if another stored procedure with the same name is found in the master database.

Use IF EXISTS (SELECT 1) instead of (SELECT *): To check the existence of a record in another table, we use the IF EXISTS clause. The IF EXISTS clause returns True if any value is returned from an internal statement, whether a single value "1" or all columns of a record or a complete recordset. The output of the internal statement is not used. Hence, to minimize the data for processing and network transfer, we should use "1" in the SELECT clause of the internal statement, as shown below:
IF EXISTS (SELECT 1 FROM sysobjects
WHERE name = 'MyTable' AND type = 'U')

Use the sp_executesql stored procedure instead of the EXECUTE statement:
The sp_executesql stored procedure supports parameters. So, using the sp_executesql stored procedure instead of the EXECUTE statement improves the reusability of your code. The execution plan of a dynamic statement can be reused only if each and every character, including case, spaces, comments and parameters, is the same for the two statements. For example, if we execute the batch below:
DECLARE @Query VARCHAR(100)
DECLARE @Age INT
SET @Age = 25
SET @Query = 'SELECT * FROM dbo.tblPerson WHERE Age = ' + CONVERT(VARCHAR(3),@Age)
EXEC (@Query)

If we execute the above batch again using a different @Age value, then the execution plan for the SELECT statement created for @Age = 25 would not be reused. However, if we write the above batch as given below,

DECLARE @Query NVARCHAR(100)
SET @Query = N'SELECT * FROM dbo.tblPerson WHERE Age = @Age'
EXECUTE sp_executesql @Query, N'@Age int', @Age = 25

the compiled plan of this SELECT statement will be reused for different values of the @Age parameter. The reuse of the existing compiled plan will result in improved performance.

4. Try to avoid using SQL Server cursors whenever possible:

Cursors use a lot of resources for overhead processing to maintain the current record position in a recordset, and this decreases performance. If we need to process records one by one in a loop, then we should use a WHILE loop instead. Wherever possible, we should replace the cursor-based approach with a SET-based approach, because the SQL Server engine is designed and optimized to perform SET-based operations very fast. Again, please note a cursor is also a kind of WHILE loop.

Keep the transaction as short as possible: The length of a transaction affects blocking and deadlocking. An exclusive lock is not released until the end of the transaction. In higher isolation levels, the shared locks are also held for the length of the transaction. Therefore, a lengthy transaction means locks held for a longer time, and locks held for a longer time turn into blocking. In some cases, blocking also converts into deadlocks. So, for faster execution and less blocking, the transaction should be kept as short as possible.

5. Use TRY-Catch for error handling:

Prior to SQL Server 2005, error-handling code was a big portion of the actual code, because an error-check statement had to be written after every T-SQL statement. More code always consumes more resources and time. In SQL Server 2005, a new, simpler way was introduced for the same purpose. The syntax is as follows:
BEGIN TRY
--Your t-sql code goes here
END TRY
BEGIN CATCH
--Your error handling code goes here
END CATCH

[T-SQL] Restore database trough mdf file only , ldf missing

One of my friends just called me and said that he has an MDF file but no LDF file, so what should he do if he wants to restore the database? I searched over the internet and found the two fastest methods for it:

USE [master]
GO
-- Method 1: I use this method
EXEC sp_attach_single_file_db @dbname='TestDb',
@physname=N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\TestDb.mdf'
GO
-- Method 2:
CREATE DATABASE TestDb ON
(FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\TestDb.mdf')
FOR ATTACH_REBUILD_LOG
GO

[T-SQL] Kill All Process in Database

I was just going through one query and found out that it was giving an error, but it's a really useful script for killing all processes in a database. Just be sure that you are not killing the processes of the database you are running the query from.

CREATE TABLE #TmpWho
(spid INT, ecid INT, status VARCHAR(150), loginame VARCHAR(150),
hostname VARCHAR(150), blk INT, dbname VARCHAR(150), cmd VARCHAR(150), rquest_id INT)
INSERT INTO #TmpWho
EXEC sp_who
DECLARE @spid INT
DECLARE @tString VARCHAR(15)
DECLARE @getspid CURSOR
SET @getspid = CURSOR FOR
SELECT spid
FROM #TmpWho
WHERE dbname = 'mydb'
OPEN @getspid
FETCH NEXT FROM @getspid INTO @spid
WHILE @@FETCH_STATUS = 0
BEGIN
SET @tString = 'KILL ' + CAST(@spid AS VARCHAR(5))
EXEC(@tString)
FETCH NEXT FROM @getspid INTO @spid
END
CLOSE @getspid
DEALLOCATE @getspid
DROP TABLE #TmpWho
GO

Monday, October 4, 2010

[T-SQL] Renaming the Physical Filename for Datafiles in SQL Server 2008

I just got a question this morning: how can one rename the physical filename of the data files in SQL Server 2008? So here is the link, and then some methods; use whichever suits you.

http://technet.microsoft.com/en-us/library/ms174269.aspx

There are several ways to make this change; however, to rename the physical database files at the operating system level you will have to take the database offline.

1. Use SSMS to take the database Offline (right-click on Database, select Tasks, Take Offline), change the name of the files at the OS level and then Bring it Online.

2. You could Detach the database, rename the files and then Attach the database pointing to the renamed files to do so.

3. You could Backup the database and then restore, changing the file location during the restore process.

4. using T SQL

ALTER DATABASE databaseName SET OFFLINE
GO


ALTER DATABASE databaseName MODIFY FILE (NAME = db, FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL.2\MSSQL\Data\db.mdf')
GO

--if changing log file name
ALTER DATABASE databaseName MODIFY FILE (NAME = db_log, FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL.2\MSSQL\Data\db.ldf')
GO

ALTER DATABASE databaseName SET ONLINE
GO

Thursday, September 30, 2010

Differences between an RDBMS schema and a data warehouse schema

RDBMS Schema
* Main Purpose: Record
* Used for OLTP systems
* Highly Normalized
* Difficult to understand and navigate
* Difficult to extract and solve complex problems

DWH Schema
* Main Purpose: Respond
* Used for OLAP systems
* De-normalized
* Easy to understand and navigate
* Relatively easier in extracting the data and solving complex problems

Differences between OLTP and OLAP

Main Differences between OLTP and OLAP are:-
1. User and System Orientation
OLTP: customer-oriented, used for transaction and query processing by clerks, clients and IT professionals.
OLAP: market-oriented, used for data analysis by knowledge workers (managers, executives, analysts).
2. Data Contents
OLTP: manages current data, very detail-oriented.
OLAP: manages large amounts of historical data, provides facilities for summarization and aggregation, stores information at different levels of granularity to support decision making process.
3. Database Design
OLTP: adopts an entity relationship(ER) model and an application-oriented database design.
OLAP: adopts a star, snowflake or fact constellation model and a subject-oriented database design (a minimal star schema sketch follows this list).
4. View
OLTP: focuses on the current data within an enterprise or department.
OLAP: spans multiple versions of a database schema due to the evolutionary process of an organization; integrates information from many organizational locations and data stores
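For the star schema mentioned in point 3, here is a minimal de-normalized sketch (all table and column names are made up): a central fact table holding the measures, with keys pointing at a few descriptive dimension tables.

-- Dimension tables: small, descriptive, de-normalized
CREATE TABLE dbo.DimDate (DateKey INT PRIMARY KEY, CalendarDate DATE, MonthName VARCHAR(20), CalendarYear INT)
CREATE TABLE dbo.DimProduct (ProductKey INT PRIMARY KEY, ProductName VARCHAR(100), Category VARCHAR(50))
-- Fact table: one row per measured event, with foreign keys to the dimensions plus numeric measures
CREATE TABLE dbo.FactSales
(
DateKey INT REFERENCES dbo.DimDate(DateKey),
ProductKey INT REFERENCES dbo.DimProduct(ProductKey),
Quantity INT,
SalesAmount DECIMAL(18,2)
)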

Monday, September 27, 2010

Difference between Database and Data Warehouse

I was just searching on net for basic difference between the two of my core expertise and one of the best extract I found is pasted below:

So how is a data warehouse different from your regular database? After all, both are databases, and both have some tables containing data. If you look deeper, you'd find that both have indexes, keys, views, and the regular jing-bang. So is that 'data warehouse' really different from the tables in your application? And if the two aren't really different, maybe you can just run your queries and reports directly from your application databases!

Well, to be fair, that may be just what you are doing right now, running some EOD (end-of-day) reports as complex SQL queries and shipping them off to those who need them. And this scheme might just be serving you fine right now. Nothing wrong with that if it works for you.

But before you start patting yourself on the back for having avoided a data warehouse altogether, do spend a moment to understand the differences, and to appreciate the pros and cons of either approach.

The primary difference between your application database and a data warehouse is that while the former is designed (and optimized) to record, the latter has to be designed (and optimized) to respond to analysis questions that are critical for your business.

Application databases are OLTP (On-Line Transaction Processing) systems where every transaction has to be recorded, and super-fast at that. Consider the scenario where a bank ATM has disbursed cash to a customer but was unable to record this event in the bank records. If this started happening frequently, the bank wouldn't stay in business for too long. So the banking system is designed to make sure that every transaction gets recorded within the time you stand before the ATM. This system is write-optimized, and you shouldn't crib if your analysis query (read operation) takes a lot of time on such a system.

A Data Warehouse (DW) on the other end, is a database (yes, you are right, it's a database) that is designed for facilitating querying and analysis. Often designed as OLAP (On-Line Analytical Processing) systems, these databases contain read-only data that can be queried and analysed far more efficiently as compared to your regular OLTP application databases. In this sense an OLAP system is designed to be read-optimized.

Separation from your application database also ensures that your business intelligence solution is scalable (your bank and ATMs don't go down just because the CFO asked for a report), better documented and managed (god help the novice who is given the application database diagrams and asked to locate the needle of data in the proverbial haystack of table proliferation), and can answer questions far more efficiently and frequently.

Creation of a DW leads to a direct increase in quality of analyses as the table structures are simpler (you keep only the needed information in simpler tables), standardized (well-documented table structures), and often denormalized (to reduce the linkages between tables and the corresponding complexity of queries). A DW drastically reduces the 'cost-per-analysis' and thus permits more analysis per FTE. Having a well-designed DW is the foundation successful BI/Analytics initiatives are built upon.

Sunday, September 26, 2010

[T-SQL] How to Stop Growing Log File Too Big

In continuation of my last post (http://muhammadabbas.blogspot.com/2010/09/t-sql-truncate-log-file.html), I am writing this article.

Today I was reading an article that highlighted how we can stop the log file from growing. Here is the text:

Two different solutions for it:

1) Convert the Recovery Model to Simple Recovery

If you are truncating the transaction logs, this means you are breaking the T-Log LSN (Log Sequence Number) chain. It follows that if disaster comes, you would not be able to restore your T-Logs and there would be no option for you to do point-in-time recovery. If you are fine with this situation and there is nothing to worry about, I suggest that you change your recovery model to the Simple Recovery Model. This way, you will not have extraordinary growth of your log file.

2) Start Taking Transaction Log Backup

If your business does not support loss of data or requires having point in time recovery, you cannot afford anything less than Full Recovery Model. In Full Recovery Model, your transaction log will grow until you take a backup of it. You need to take the T-Log Backup at a regular interval. This way, your log would not grow beyond some limits. If you are taking an hourly T-Log backup, your T-Log would grow until one hour but after this the T-Log backup would truncate all the ‘committed’ transactions once you take it. Doing this would lead the size of the T-Log not to go down much, but it would rather be marked as empty for the next hour’s T-Log to populate.

With this method, you can restore your database at Point of Time if a disaster ever happens at your server.

Let us run an example to demonstrate this. In this case, I have done the following steps:

1.Create Sample Database in FULL RECOVERY Model
2.Take Full Backup (full backup is must for taking subsequent backup)
3.Repeat Following Operation
1.Take Log Backup
2.Insert Some rows
3.Check the size of Log File
4.Clean Up
After a short while, you will notice that the Log file (ldf) will stop increasing but the size of the backup will increase.

If you have an issue with your log file growth, I suggest that you follow either of the above solutions instead of truncating it.

/* FULL Recovery and Log File Growth */
USE [master]
GO
-- Create Database SimpleTran
IF EXISTS (SELECT name FROM sys.databases WHERE name = N'SimpleTran')
BEGIN
ALTER DATABASE [SimpleTran] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
DROP DATABASE [SimpleTran]
END
GO
CREATE DATABASE [SimpleTran]
GO
-- Set database recovery model to FULL
ALTER DATABASE [SimpleTran] SET RECOVERY FULL
GO
BACKUP DATABASE [SimpleTran] TO DISK = N'D:\SimpleTran.bak' WITH NOFORMAT, NOINIT, NAME = N'SimpleTran-Full Database Backup', SKIP, NOREWIND, NOUNLOAD, STATS = 10
GO
-- Check Database Log File Size
SELECT DB_NAME(database_id) AS DatabaseName,
Name AS Logical_Name,
Physical_Name, (size*8)/1024 SizeMB
FROM sys.master_files
WHERE DB_NAME(database_id) = 'SimpleTran'
GO
-- Create Table in Database with Transaction
USE SimpleTran
GO
IF EXISTS (SELECT * FROM sys.objects WHERE OBJECT_ID = OBJECT_ID(N'[dbo].[RealTempTable]') AND TYPE IN (N'U'))
DROP TABLE [dbo].[RealTempTable]
GO
CREATE TABLE RealTempTable (ID INT)
INSERT INTO RealTempTable (ID)
SELECT TOP 50000 ROW_NUMBER() OVER (ORDER BY a.name) RowID
FROM sys.all_objects a
CROSS JOIN sys.all_objects b
GO
-- Check the size of the Database
SELECT DB_NAME(database_id) AS DatabaseName,
Name AS Logical_Name,
Physical_Name, (size*8)/1024 SizeMB
FROM sys.master_files
WHERE DB_NAME(database_id) = 'SimpleTran'
GO
-- Take Log Backup
BACKUP LOG [SimpleTran] TO DISK = N'D:\SimpleTran.bak' WITH NOFORMAT, NOINIT, NAME = N'SimpleTran-Transaction Log Backup', SKIP, NOREWIND, NOUNLOAD, STATS = 10
GO
-- Run following transaction multiple times and check the size of T-Log
INSERT INTO RealTempTable (ID)
SELECT TOP 50000 ROW_NUMBER() OVER (ORDER BY a.name) RowID
FROM sys.all_objects a
CROSS JOIN sys.all_objects b
GO
-- Check the size of the Database
SELECT DB_NAME(database_id) AS DatabaseName,
Name AS Logical_Name,
Physical_Name, (size*8)/1024 SizeMB
FROM sys.master_files
WHERE DB_NAME(database_id) = 'SimpleTran'
GO
/*
Now run following code multiple times.
You will notice that it will not increase the size of the .ldf file but will for sure
increase the size of the log backup.
*/
-- Second Time
-- START
BACKUP LOG [SimpleTran] TO DISK = N'D:\SimpleTran.log' WITH NOFORMAT, NOINIT, NAME = N'SimpleTran-Transaction Log Backup', SKIP, NOREWIND, NOUNLOAD, STATS = 10
GO
-- Run following transaction and check the size of T-Log
INSERT INTO RealTempTable (ID)
SELECT TOP 50000 ROW_NUMBER() OVER (ORDER BY a.name) RowID
FROM sys.all_objects a
CROSS JOIN sys.all_objects b
GO
-- Check the size of the Database
SELECT DB_NAME(database_id) AS DatabaseName,
Name AS Logical_Name,
Physical_Name, (size*8)/1024 SizeMB
FROM sys.master_files
WHERE DB_NAME(database_id) = 'SimpleTran'
GO
-- END
--Clean Up
USE MASTER
GO
IF EXISTS (SELECT name FROM sys.databases WHERE name = N'SimpleTran')
BEGIN
ALTER DATABASE [SimpleTran] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
DROP DATABASE [SimpleTran]
END

[T-SQL] Truncate Log File

A few days back, I was called by my friend who was working on a community database with a problem: the log file of his DB was growing bigger and he was worried about it. So I went there, saw that it's not a big issue, and told him that I would prepare a script and give it to him so that he could run it once a month:

USE TEST;
GO
-- Truncate the log by changing the database recovery model to SIMPLE.
ALTER DATABASE TEST
SET RECOVERY SIMPLE;
GO
-- Shrink the truncated log file to 1 MB.
DBCC SHRINKFILE (TEST_log, 1);
GO
-- Reset the database recovery model.
ALTER DATABASE TEST
SET RECOVERY FULL;
GO

Thursday, September 23, 2010

[T-SQL] Generate Delete script for all tables

You can use this script to generate a DELETE script, or change it slightly to generate TRUNCATE statements:

select 'delete from ' + o.name + ';'
from sysobjects o
where o.type = 'U';

Wednesday, September 22, 2010

[T-SQL] Delete all data in database (when you have FKs)

SET NOCOUNT ON
GO

SELECT 'USE [' + db_name() +']';

SELECT 'ALTER TABLE ' +
'[' + s.name + '].[' + t.name + ']' +
' DROP CONSTRAINT [' + f.name +']'
FROM sys.foreign_keys f
INNER JOIN sys.TABLES t ON f.parent_object_id=t.object_id
INNER JOIN sys.schemas s ON t.schema_id=s.schema_id
WHERE t.is_ms_shipped=0;


SELECT 'TRUNCATE TABLE ' + '[' + s.name + '].[' + t.name + ']'
FROM sys.TABLES t
INNER JOIN sys.schemas s ON t.schema_id=s.schema_id
WHERE t.is_ms_shipped=0;


SELECT 'ALTER TABLE ' +
'[' + s.name + '].[' + t.name + ']' +
' ADD CONSTRAINT [' + f.name + ']' +
' FOREIGN KEY (' +
Stuff( (SELECT ', ['+col_name(fk.parent_object_id, fk.parent_column_id) +']'
FROM sys.foreign_key_columns fk
WHERE constraint_object_id = f.object_id
ORDER BY constraint_column_id
FOR XML Path('')
), 1,2,'') + ')' +
' REFERENCES [' +
object_schema_name(f.referenced_object_id)+'].['+object_name(f.referenced_object_id) + '] (' +
Stuff((SELECT ', ['+col_name(fc.referenced_object_id, fc.referenced_column_id)+']'
FROM sys.foreign_key_columns fc
WHERE constraint_object_id = f.object_id
ORDER BY constraint_column_id
FOR XML Path('')),
1,2,'') +
')' +
' ON DELETE ' + REPLACE(f.delete_referential_action_desc, '_', ' ') +
' ON UPDATE ' + REPLACE(f.update_referential_action_desc , '_', ' ') collate database_default
FROM sys.foreign_keys f
INNER JOIN sys.TABLES t ON f.parent_object_id=t.object_id
INNER JOIN sys.schemas s ON t.schema_id=s.schema_id
WHERE t.is_ms_shipped=0;

Monday, September 20, 2010

[T-SQL] Row Count in Tables

Below is the query through which you can get row counts for all the tables:

SELECT sc.name +'.'+ ta.name TableName
,SUM(pa.rows) RowCnt
FROM sys.tables ta
INNER JOIN sys.partitions pa
ON pa.OBJECT_ID = ta.OBJECT_ID
INNER JOIN sys.schemas sc
ON ta.schema_id = sc.schema_id
WHERE ta.is_ms_shipped = 0 AND pa.index_id IN (1,0)
GROUP BY sc.name,ta.name
ORDER BY SUM(pa.rows) DESC

[T-SQL] Find Auto Created Stats and Delete query for them

Query for getting them:

SELECT name, OBJECT_NAME(OBJECT_ID)
FROM sys.stats
WHERE auto_created = 1;

Delete query for one of the table stats:

DROP STATISTICS TableName.StatisticsName

Thursday, September 16, 2010

[Personal] Blessed by a baby boy

On the 23rd of Ramazan, I was blessed with a baby boy. Too cuteeeee.

His grandfather has named him "Ali Imran".

[T-SQL] Transport Level Error

This is an error which I have encountered a lot these days:

A transport-level error has occurred when sending the request to the
server.

Here is the reason for it:

You used to have an open connection to a server but you don’t any more. It was closed or killed from the server side. Maybe the server rebooted (or failed-over to another cluster node). You’re talking into one tin can, but nobody’s on the tin can on the other end of the string.

Annoying, but not a problem… open a new connection and try again. If you see this error message from inside SQL Server Management Studio, just hit F5 again (or Alt-X or Ctrl-E depending on preference).

[T-SQL] Very Nice Article Regarding T-SQL

A very nice article regarding the don'ts of T-SQL:

http://michaeljswart.com/?p=823

Monday, August 23, 2010

[T-SQL] Function to Retrieve First Word of Sentence

--This code/function can help you get the first word from a sentence.
--TEST SCRIPT
DECLARE @StringVar VARCHAR(100)
SET @StringVar = ' anything '
SELECT CASE CHARINDEX(' ', LTRIM(@StringVar), 1)
WHEN 0 THEN LTRIM(@StringVar)
ELSE SUBSTRING(LTRIM(@StringVar), 1, CHARINDEX(' ',LTRIM(@StringVar), 1) - 1)
END FirstWordofSentence

--FUNCTION CODE
CREATE FUNCTION GetFirstWord (@StringVar VARCHAR(MAX))
RETURNS VARCHAR(MAX)
AS
BEGIN
DECLARE @rtStringVar VARCHAR(MAX)
SELECT @rtStringVar = CASE CHARINDEX(' ', LTRIM(@StringVar), 1)
WHEN 0 THEN LTRIM(@StringVar)
ELSE SUBSTRING(LTRIM(@StringVar), 1, CHARINDEX(' ',LTRIM(@StringVar), 1) - 1)
END
RETURN @rtStringVar
END
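A quick usage check of the function above (assuming it was created in the dbo schema):

-- Returns 'Hello'
SELECT dbo.GetFirstWord(' Hello world from T-SQL') AS FirstWord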

Wednesday, August 18, 2010

[T-SQL] To concatenate the values of a column in all rows into one row.

USE pubs
GO
DECLARE @title_ids varchar(150), @delimiter char
SET @delimiter = ','
SELECT @title_ids = COALESCE(@title_ids + @delimiter, '') + title_id FROM titles
SELECT @title_ids AS [List of Title IDs]

[T-SQL] To convert a given string to proper case

CREATE FUNCTION PROPERCASE
(
--The string to be converted to proper case
@input varchar(8000)
)
--This function returns the proper case string of varchar type
RETURNS varchar(8000)
AS
BEGIN
IF @input IS NULL
BEGIN
--Just return NULL if input string is NULL
RETURN NULL
END

--Character variable declarations
DECLARE @output varchar(8000)
--Integer variable declarations
DECLARE @ctr int, @len int, @found_at int
--Constant declarations
DECLARE @LOWER_CASE_a int, @LOWER_CASE_z int, @Delimiter char(3), @UPPER_CASE_A int, @UPPER_CASE_Z int

--Variable/Constant initializations
SET @ctr = 1
SET @len = LEN(@input)
SET @output = ''
SET @LOWER_CASE_a = 97
SET @LOWER_CASE_z = 122
SET @Delimiter = ' ,-'
SET @UPPER_CASE_A = 65
SET @UPPER_CASE_Z = 90

WHILE @ctr <= @len
BEGIN
--This loop will take care of recurring white spaces
WHILE CHARINDEX(SUBSTRING(@input,@ctr,1), @Delimiter) > 0
BEGIN
SET @output = @output + SUBSTRING(@input,@ctr,1)
SET @ctr = @ctr + 1
END

IF ASCII(SUBSTRING(@input,@ctr,1)) BETWEEN @LOWER_CASE_a AND @LOWER_CASE_z
BEGIN
--Converting the first character to upper case
SET @output = @output + UPPER(SUBSTRING(@input,@ctr,1))
END
ELSE
BEGIN
SET @output = @output + SUBSTRING(@input,@ctr,1)
END

SET @ctr = @ctr + 1

WHILE CHARINDEX(SUBSTRING(@input,@ctr,1), @Delimiter) = 0 AND (@ctr <= @len)
BEGIN
IF ASCII(SUBSTRING(@input,@ctr,1)) BETWEEN @UPPER_CASE_A AND @UPPER_CASE_Z
BEGIN
SET @output = @output + LOWER(SUBSTRING(@input,@ctr,1))
END
ELSE
BEGIN
SET @output = @output + SUBSTRING(@input,@ctr,1)
END
SET @ctr = @ctr + 1
END

END
RETURN @output
END
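A quick usage check of the function above (again assuming it was created in the dbo schema):

-- Returns 'Muhammad Abbas, Database Blog'
SELECT dbo.PROPERCASE('muhammad abbas, database blog') AS ProperCased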

[T-SQL] Get Biggest Table on Your Production DB

USE master
GO

IF EXISTS (SELECT 1 FROM sysobjects WHERE name = 'sp_show_huge_tables' AND type = 'P')
DROP PROC sp_show_huge_tables
GO

CREATE PROC sp_show_huge_tables
(
@top int = NULL,
@include_system_tables bit = 0
)
AS

BEGIN
IF @top > 0
SET ROWCOUNT @top

SELECT [Table Name], (SELECT rows FROM sysindexes s WHERE s.indid < 2 AND s.id = OBJECT_ID(a.[Table Name])) AS [Row count], [Total space used (MB)] FROM
(
SELECT QUOTENAME(USER_NAME(o.uid)) + '.' + QUOTENAME(OBJECT_NAME(i.id)) AS [Table Name],
CONVERT(numeric(15,2),(((CONVERT(numeric(15,2),SUM(i.reserved)) * (SELECT low FROM master.dbo.spt_values (NOLOCK) WHERE number = 1 AND type = 'E')) / 1024.)/1024.)) AS [Total space used (MB)]
FROM sysindexes i (NOLOCK)
INNER JOIN
sysobjects o (NOLOCK)
ON
i.id = o.id AND
((@include_system_tables = 1 AND o.type IN ('U', 'S')) OR o.type = 'U') AND
((@include_system_tables = 1)OR (OBJECTPROPERTY(i.id, 'IsMSShipped') = 0))
WHERE indid IN (0, 1, 255)
GROUP BY QUOTENAME(USER_NAME(o.uid)) + '.' + QUOTENAME(OBJECT_NAME(i.id))

) as a
ORDER BY [Total space used (MB)] DESC


SET ROWCOUNT 0
END

GO

GRANT EXEC ON sp_show_huge_tables TO Public
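
A possible usage sketch, run from the database you want to inspect (the database name below is hypothetical):

USE MyUserDatabase
GO
--Top 10 largest tables (user tables only)
EXEC sp_show_huge_tables @top = 10, @include_system_tables = 0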

[T-SQL] Procedure to script your data (to generate INSERT statements from the existing data)

SET NOCOUNT ON
GO

PRINT 'Using Master database'
USE master
GO

PRINT 'Checking for the existence of this procedure'
IF (SELECT OBJECT_ID('sp_generate_inserts','P')) IS NOT NULL --means, the procedure already exists
BEGIN
PRINT 'Procedure already exists. So, dropping it'
DROP PROC sp_generate_inserts
END
GO

CREATE PROC sp_generate_inserts
(
@table_name varchar(776), -- The table/view for which the INSERT statements will be generated using the existing data
@target_table varchar(776) = NULL, -- Use this parameter to specify a different table name into which the data will be inserted
@include_column_list bit = 1, -- Use this parameter to include/ommit column list in the generated INSERT statement
@from varchar(800) = NULL, -- Use this parameter to filter the rows based on a filter condition (using WHERE)
@include_timestamp bit = 0, -- Specify 1 for this parameter, if you want to include the TIMESTAMP/ROWVERSION column's data in the INSERT statement
@debug_mode bit = 0, -- If @debug_mode is set to 1, the SQL statements constructed by this procedure will be printed for later examination
@owner varchar(64) = NULL, -- Use this parameter if you are not the owner of the table
@ommit_images bit = 0, -- Use this parameter to generate INSERT statements by omitting the 'image' columns
@ommit_identity bit = 0, -- Use this parameter to ommit the identity columns
@top int = NULL, -- Use this parameter to generate INSERT statements only for the TOP n rows
@cols_to_include varchar(8000) = NULL, -- List of columns to be included in the INSERT statement
@cols_to_exclude varchar(8000) = NULL, -- List of columns to be excluded from the INSERT statement
@disable_constraints bit = 0, -- When 1, disables foreign key constraints and enables them after the INSERT statements
@ommit_computed_cols bit = 0 -- When 1, computed columns will not be included in the INSERT statement

)
AS
BEGIN

/***********************************************************************************************************
Procedure: sp_generate_inserts (Build 22)
(Copyright © 2002 Narayana Vyas Kondreddi. All rights reserved.)

Purpose: To generate INSERT statements from existing data.
These INSERTS can be executed to regenerate the data at some other location.
This procedure is also useful to create a database setup, where in you can
script your data along with your table definitions.

Written by: Narayana Vyas Kondreddi
http://vyaskn.tripod.com

Acknowledgements:
Divya Kalra -- For beta testing
Mark Charsley -- For reporting a problem with scripting uniqueidentifier columns with NULL values
Artur Zeygman -- For helping me simplify a bit of code for handling non-dbo owned tables
Joris Laperre -- For reporting a regression bug in handling text/ntext columns

Tested on: SQL Server 7.0 and SQL Server 2000 and SQL Server 2005

Date created: January 17th 2001 21:52 GMT

Date modified: May 1st 2002 19:50 GMT

Email: vyaskn@hotmail.com

NOTE: This procedure may not work with tables with too many columns.
Results can be unpredictable with huge text columns or SQL Server 2000's sql_variant data types
Whenever possible, Use @include_column_list parameter to ommit column list in the INSERT statement, for better results
IMPORTANT: This procedure is not tested with international data (extended characters or Unicode). If needed
you might want to convert the datatypes of character variables in this procedure to their respective unicode counterparts
like nchar and nvarchar

ALSO NOTE THAT THIS PROCEDURE IS NOT UPDATED TO WORK WITH NEW DATA TYPES INTRODUCED IN SQL SERVER 2005 / YUKON


Example 1: To generate INSERT statements for table 'titles':

EXEC sp_generate_inserts 'titles'

Example 2: To ommit the column list in the INSERT statement: (Column list is included by default)
IMPORTANT: If you have too many columns, you are advised to ommit column list, as shown below,
to avoid erroneous results

EXEC sp_generate_inserts 'titles', @include_column_list = 0

Example 3: To generate INSERT statements for 'titlesCopy' table from 'titles' table:

EXEC sp_generate_inserts 'titles', 'titlesCopy'

Example 4: To generate INSERT statements for 'titles' table for only those titles
which contain the word 'Computer' in them:
NOTE: Do not complicate the FROM or WHERE clause here. It's assumed that you are good with T-SQL if you are using this parameter

EXEC sp_generate_inserts 'titles', @from = "from titles where title like '%Computer%'"

Example 5: To specify that you want to include TIMESTAMP column's data as well in the INSERT statement:
(By default TIMESTAMP column's data is not scripted)

EXEC sp_generate_inserts 'titles', @include_timestamp = 1

Example 6: To print the debug information:

EXEC sp_generate_inserts 'titles', @debug_mode = 1

Example 7: If you are not the owner of the table, use @owner parameter to specify the owner name
To use this option, you must have SELECT permissions on that table

EXEC sp_generate_inserts Nickstable, @owner = 'Nick'

Example 8: To generate INSERT statements for the rest of the columns excluding images
When using this option, DO NOT set @include_column_list parameter to 0.

EXEC sp_generate_inserts imgtable, @ommit_images = 1

Example 9: To generate INSERT statements excluding (ommiting) IDENTITY columns:
(By default IDENTITY columns are included in the INSERT statement)

EXEC sp_generate_inserts mytable, @ommit_identity = 1

Example 10: To generate INSERT statements for the TOP 10 rows in the table:

EXEC sp_generate_inserts mytable, @top = 10

Example 11: To generate INSERT statements with only those columns you want:

EXEC sp_generate_inserts titles, @cols_to_include = "'title','title_id','au_id'"

Example 12: To generate INSERT statements by omitting certain columns:

EXEC sp_generate_inserts titles, @cols_to_exclude = "'title','title_id','au_id'"

Example 13: To avoid checking the foreign key constraints while loading data with INSERT statements:

EXEC sp_generate_inserts titles, @disable_constraints = 1

Example 14: To exclude computed columns from the INSERT statement:
EXEC sp_generate_inserts MyTable, @ommit_computed_cols = 1
***********************************************************************************************************/

SET NOCOUNT ON

--Making sure user only uses either @cols_to_include or @cols_to_exclude
IF ((@cols_to_include IS NOT NULL) AND (@cols_to_exclude IS NOT NULL))
BEGIN
RAISERROR('Use either @cols_to_include or @cols_to_exclude. Do not use both the parameters at once',16,1)
RETURN -1 --Failure. Reason: Both @cols_to_include and @cols_to_exclude parameters are specified
END

--Making sure the @cols_to_include and @cols_to_exclude parameters are receiving values in proper format
IF ((@cols_to_include IS NOT NULL) AND (PATINDEX('''%''',@cols_to_include) = 0))
BEGIN
RAISERROR('Invalid use of @cols_to_include property',16,1)
PRINT 'Specify column names surrounded by single quotes and separated by commas'
PRINT 'Eg: EXEC sp_generate_inserts titles, @cols_to_include = "''title_id'',''title''"'
RETURN -1 --Failure. Reason: Invalid use of @cols_to_include property
END

IF ((@cols_to_exclude IS NOT NULL) AND (PATINDEX('''%''',@cols_to_exclude) = 0))
BEGIN
RAISERROR('Invalid use of @cols_to_exclude property',16,1)
PRINT 'Specify column names surrounded by single quotes and separated by commas'
PRINT 'Eg: EXEC sp_generate_inserts titles, @cols_to_exclude = "''title_id'',''title''"'
RETURN -1 --Failure. Reason: Invalid use of @cols_to_exclude property
END


--Checking to see if the database name is specified along with the table name
--Your database context should be local to the table for which you want to generate INSERT statements
--specifying the database name is not allowed
IF (PARSENAME(@table_name,3)) IS NOT NULL
BEGIN
RAISERROR('Do not specify the database name. Be in the required database and just specify the table name.',16,1)
RETURN -1 --Failure. Reason: Database name is specified along with the table name, which is not allowed
END

--Checking for the existence of 'user table' or 'view'
--This procedure is not written to work on system tables
--To script the data in system tables, just create a view on the system tables and script the view instead

IF @owner IS NULL
BEGIN
IF ((OBJECT_ID(@table_name,'U') IS NULL) AND (OBJECT_ID(@table_name,'V') IS NULL))
BEGIN
RAISERROR('User table or view not found.',16,1)
PRINT 'You may see this error, if you are not the owner of this table or view. In that case use @owner parameter to specify the owner name.'
PRINT 'Make sure you have SELECT permission on that table or view.'
RETURN -1 --Failure. Reason: There is no user table or view with this name
END
END
ELSE
BEGIN
IF NOT EXISTS (SELECT 1 FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = @table_name AND (TABLE_TYPE = 'BASE TABLE' OR TABLE_TYPE = 'VIEW') AND TABLE_SCHEMA = @owner)
BEGIN
RAISERROR('User table or view not found.',16,1)
PRINT 'You may see this error, if you are not the owner of this table. In that case use @owner parameter to specify the owner name.'
PRINT 'Make sure you have SELECT permission on that table or view.'
RETURN -1 --Failure. Reason: There is no user table or view with this name
END
END

--Variable declarations
DECLARE @Column_ID int,
@Column_List varchar(8000),
@Column_Name varchar(128),
@Start_Insert varchar(786),
@Data_Type varchar(128),
@Actual_Values varchar(8000), --This is the string that will be finally executed to generate INSERT statements
@IDN varchar(128) --Will contain the IDENTITY column's name in the table

--Variable Initialization
SET @IDN = ''
SET @Column_ID = 0
SET @Column_Name = ''
SET @Column_List = ''
SET @Actual_Values = ''

IF @owner IS NULL
BEGIN
SET @Start_Insert = 'INSERT INTO ' + '[' + RTRIM(COALESCE(@target_table,@table_name)) + ']'
END
ELSE
BEGIN
SET @Start_Insert = 'INSERT ' + '[' + LTRIM(RTRIM(@owner)) + '].' + '[' + RTRIM(COALESCE(@target_table,@table_name)) + ']'
END


--To get the first column's ID

SELECT @Column_ID = MIN(ORDINAL_POSITION)
FROM INFORMATION_SCHEMA.COLUMNS (NOLOCK)
WHERE TABLE_NAME = @table_name AND
(@owner IS NULL OR TABLE_SCHEMA = @owner)



--Loop through all the columns of the table, to get the column names and their data types
WHILE @Column_ID IS NOT NULL
BEGIN
SELECT @Column_Name = QUOTENAME(COLUMN_NAME),
@Data_Type = DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS (NOLOCK)
WHERE ORDINAL_POSITION = @Column_ID AND
TABLE_NAME = @table_name AND
(@owner IS NULL OR TABLE_SCHEMA = @owner)



IF @cols_to_include IS NOT NULL --Selecting only user specified columns
BEGIN
IF CHARINDEX( '''' + SUBSTRING(@Column_Name,2,LEN(@Column_Name)-2) + '''',@cols_to_include) = 0
BEGIN
GOTO SKIP_LOOP
END
END

IF @cols_to_exclude IS NOT NULL --Selecting only user specified columns
BEGIN
IF CHARINDEX( '''' + SUBSTRING(@Column_Name,2,LEN(@Column_Name)-2) + '''',@cols_to_exclude) <> 0
BEGIN
GOTO SKIP_LOOP
END
END

--Making sure to output SET IDENTITY_INSERT ON/OFF in case the table has an IDENTITY column
IF (SELECT COLUMNPROPERTY( OBJECT_ID(QUOTENAME(COALESCE(@owner,USER_NAME())) + '.' + @table_name),SUBSTRING(@Column_Name,2,LEN(@Column_Name) - 2),'IsIdentity')) = 1
BEGIN
IF @ommit_identity = 0 --Determining whether to include or exclude the IDENTITY column
SET @IDN = @Column_Name
ELSE
GOTO SKIP_LOOP
END

--Making sure whether to output computed columns or not
IF @ommit_computed_cols = 1
BEGIN
IF (SELECT COLUMNPROPERTY( OBJECT_ID(QUOTENAME(COALESCE(@owner,USER_NAME())) + '.' + @table_name),SUBSTRING(@Column_Name,2,LEN(@Column_Name) - 2),'IsComputed')) = 1
BEGIN
GOTO SKIP_LOOP
END
END

--Tables with columns of IMAGE data type are not supported for obvious reasons
IF(@Data_Type in ('image'))
BEGIN
IF (@ommit_images = 0)
BEGIN
RAISERROR('Tables with image columns are not supported.',16,1)
PRINT 'Use @ommit_images = 1 parameter to generate INSERTs for the rest of the columns.'
PRINT 'DO NOT ommit Column List in the INSERT statements. If you ommit column list using @include_column_list=0, the generated INSERTs will fail.'
RETURN -1 --Failure. Reason: There is a column with image data type
END
ELSE
BEGIN
GOTO SKIP_LOOP
END
END

--Determining the data type of the column and depending on the data type, the VALUES part of
--the INSERT statement is generated. Care is taken to handle columns with NULL values. Also
--making sure not to lose any data from float, real, money, smallmoney, datetime columns
SET @Actual_Values = @Actual_Values +
CASE
WHEN @Data_Type IN ('char','varchar','nchar','nvarchar')
THEN
'COALESCE('''''''' + REPLACE(RTRIM(' + @Column_Name + '),'''''''','''''''''''')+'''''''',''NULL'')'
WHEN @Data_Type IN ('datetime','smalldatetime')
THEN
'COALESCE('''''''' + RTRIM(CONVERT(char,' + @Column_Name + ',109))+'''''''',''NULL'')'
WHEN @Data_Type IN ('uniqueidentifier')
THEN
'COALESCE('''''''' + REPLACE(CONVERT(char(255),RTRIM(' + @Column_Name + ')),'''''''','''''''''''')+'''''''',''NULL'')'
WHEN @Data_Type IN ('text','ntext')
THEN
'COALESCE('''''''' + REPLACE(CONVERT(char(8000),' + @Column_Name + '),'''''''','''''''''''')+'''''''',''NULL'')'
WHEN @Data_Type IN ('binary','varbinary')
THEN
'COALESCE(RTRIM(CONVERT(char,' + 'CONVERT(int,' + @Column_Name + '))),''NULL'')'
WHEN @Data_Type IN ('timestamp','rowversion')
THEN
CASE
WHEN @include_timestamp = 0
THEN
'''DEFAULT'''
ELSE
'COALESCE(RTRIM(CONVERT(char,' + 'CONVERT(int,' + @Column_Name + '))),''NULL'')'
END
WHEN @Data_Type IN ('float','real','money','smallmoney')
THEN
'COALESCE(LTRIM(RTRIM(' + 'CONVERT(char, ' + @Column_Name + ',2)' + ')),''NULL'')'
ELSE
'COALESCE(LTRIM(RTRIM(' + 'CONVERT(char, ' + @Column_Name + ')' + ')),''NULL'')'
END + '+' + ''',''' + ' + '

--Generating the column list for the INSERT statement
SET @Column_List = @Column_List + @Column_Name + ','

SKIP_LOOP: --The label used in GOTO

SELECT @Column_ID = MIN(ORDINAL_POSITION)
FROM INFORMATION_SCHEMA.COLUMNS (NOLOCK)
WHERE TABLE_NAME = @table_name AND
ORDINAL_POSITION > @Column_ID AND
(@owner IS NULL OR TABLE_SCHEMA = @owner)


--Loop ends here!
END

--To get rid of the extra characters that got concatenated during the last run through the loop
SET @Column_List = LEFT(@Column_List,len(@Column_List) - 1)
SET @Actual_Values = LEFT(@Actual_Values,len(@Actual_Values) - 6)

IF LTRIM(@Column_List) = ''
BEGIN
RAISERROR('No columns to select. There should at least be one column to generate the output',16,1)
RETURN -1 --Failure. Reason: Looks like all the columns are ommitted using the @cols_to_exclude parameter
END

--Forming the final string that will be executed, to output the INSERT statements
IF (@include_column_list <> 0)
BEGIN
SET @Actual_Values =
'SELECT ' +
CASE WHEN @top IS NULL OR @top < 0 THEN '' ELSE ' TOP ' + LTRIM(STR(@top)) + ' ' END +
'''' + RTRIM(@Start_Insert) + ' ''+' + '''(' + RTRIM(@Column_List) + '''+' + ''')''' +
' +''VALUES(''+ ' + @Actual_Values + '+'')''' + ' ' +
COALESCE(@from,' FROM ' + CASE WHEN @owner IS NULL THEN '' ELSE '[' + LTRIM(RTRIM(@owner)) + '].' END + '[' + rtrim(@table_name) + ']' + '(NOLOCK)')
END
ELSE IF (@include_column_list = 0)
BEGIN
SET @Actual_Values =
'SELECT ' +
CASE WHEN @top IS NULL OR @top < 0 THEN '' ELSE ' TOP ' + LTRIM(STR(@top)) + ' ' END +
'''' + RTRIM(@Start_Insert) +
' '' +''VALUES(''+ ' + @Actual_Values + '+'')''' + ' ' +
COALESCE(@from,' FROM ' + CASE WHEN @owner IS NULL THEN '' ELSE '[' + LTRIM(RTRIM(@owner)) + '].' END + '[' + rtrim(@table_name) + ']' + '(NOLOCK)')
END

--Determining whether to output any debug information
IF @debug_mode = 1
BEGIN
PRINT '/*****START OF DEBUG INFORMATION*****'
PRINT 'Beginning of the INSERT statement:'
PRINT @Start_Insert
PRINT ''
PRINT 'The column list:'
PRINT @Column_List
PRINT ''
PRINT 'The SELECT statement executed to generate the INSERTs'
PRINT @Actual_Values
PRINT ''
PRINT '*****END OF DEBUG INFORMATION*****/'
PRINT ''
END

PRINT '--INSERTs generated by ''sp_generate_inserts'' stored procedure written by Vyas'
PRINT '--Build number: 22'
PRINT '--Problems/Suggestions? Contact Vyas @ vyaskn@hotmail.com'
PRINT '--http://vyaskn.tripod.com'
PRINT ''
PRINT 'SET NOCOUNT ON'
PRINT ''

--Determining whether to print IDENTITY_INSERT or not
IF (@IDN <> '')
BEGIN
PRINT 'SET IDENTITY_INSERT ' + QUOTENAME(COALESCE(@owner,USER_NAME())) + '.' + QUOTENAME(@table_name) + ' ON'
PRINT 'GO'
PRINT ''
END


IF @disable_constraints = 1 AND (OBJECT_ID(QUOTENAME(COALESCE(@owner,USER_NAME())) + '.' + @table_name, 'U') IS NOT NULL)
BEGIN
IF @owner IS NULL
BEGIN
SELECT 'ALTER TABLE ' + QUOTENAME(COALESCE(@target_table, @table_name)) + ' NOCHECK CONSTRAINT ALL' AS '--Code to disable constraints temporarily'
END
ELSE
BEGIN
SELECT 'ALTER TABLE ' + QUOTENAME(@owner) + '.' + QUOTENAME(COALESCE(@target_table, @table_name)) + ' NOCHECK CONSTRAINT ALL' AS '--Code to disable constraints temporarily'
END

PRINT 'GO'
END

PRINT ''
PRINT 'PRINT ''Inserting values into ' + '[' + RTRIM(COALESCE(@target_table,@table_name)) + ']' + ''''


--All the hard work pays off here!!! You'll get your INSERT statements, when the next line executes!
EXEC (@Actual_Values)

PRINT 'PRINT ''Done'''
PRINT ''


IF @disable_constraints = 1 AND (OBJECT_ID(QUOTENAME(COALESCE(@owner,USER_NAME())) + '.' + @table_name, 'U') IS NOT NULL)
BEGIN
IF @owner IS NULL
BEGIN
SELECT 'ALTER TABLE ' + QUOTENAME(COALESCE(@target_table, @table_name)) + ' CHECK CONSTRAINT ALL' AS '--Code to enable the previously disabled constraints'
END
ELSE
BEGIN
SELECT 'ALTER TABLE ' + QUOTENAME(@owner) + '.' + QUOTENAME(COALESCE(@target_table, @table_name)) + ' CHECK CONSTRAINT ALL' AS '--Code to enable the previously disabled constraints'
END

PRINT 'GO'
END

PRINT ''
IF (@IDN <> '')
BEGIN
PRINT 'SET IDENTITY_INSERT ' + QUOTENAME(COALESCE(@owner,USER_NAME())) + '.' + QUOTENAME(@table_name) + ' OFF'
PRINT 'GO'
END

PRINT 'SET NOCOUNT OFF'


SET NOCOUNT OFF
RETURN 0 --Success. We are done!
END

GO

PRINT 'Created the procedure'
GO


--Mark procedure as system object
EXEC sys.sp_MS_marksystemobject sp_generate_inserts
GO

PRINT 'Granting EXECUTE permission on sp_generate_inserts to all users'
GRANT EXEC ON sp_generate_inserts TO public

SET NOCOUNT OFF
GO

PRINT 'Done'

/*
Usage:

Example 1: To generate INSERT statements for table 'titles':
EXEC sp_generate_inserts 'titles'
Example 2: To ommit the column list in the INSERT statement: (Column list is included by default)
NOTE: If you have too many columns, you are advised to ommit column list, as shown below, to avoid erroneous results
EXEC sp_generate_inserts 'titles', @Include_Column_List = 0
Example 3: To generate INSERT statements for 'titlesCopy' table from 'titles' table:
EXEC sp_generate_inserts 'titles', 'titlesCopy'
Example 4: To generate INSERT statements for 'titles' table for only those titles which contain the word 'Computer' in them:
EXEC sp_generate_inserts 'titles', @From = "from titles where title like '%Computer%'"
Example 5: To specify that you want to include TIMESTAMP column's data as well in the INSERT statement:
NOTE: By default TIMESTAMP column's data is not scripted
EXEC sp_generate_inserts 'titles', @Include_Timestamp = 1
Example 6: To print the debug information:
EXEC sp_generate_inserts 'titles', @debug_mode = 1
Example 7: If you are not the owner of the table, use @owner parameter to specify the owner name:
NOTE: To use this option, you must have SELECT permissions on that table
EXEC sp_generate_inserts Nickstable, @owner = 'Nick'
Example 8: To generate INSERT statements for the rest of the columns excluding images:
NOTE: When using this option, DO NOT set @include_column_list parameter to 0
EXEC sp_generate_inserts imgtable, @ommit_images = 1
Example 9: To generate INSERT statements for the rest of the columns excluding IDENTITY column:
EXEC sp_generate_inserts mytable, @ommit_identity = 1
Example 10: To generate INSERT statements for the top 10 rows in the table:
EXEC sp_generate_inserts mytable, @top = 10
Example 11: To generate INSERT statements only with the columns you want:
EXEC sp_generate_inserts titles, @cols_to_include = "'title','title_id','au_id'"
Example 12: To generate INSERT statements by ommitting some columns:
EXEC sp_generate_inserts titles, @cols_to_exclude = "'title','title_id','au_id'"
Example 13: To avoid checking the foreign key constraints while loading data with INSERT statements:
NOTE: The @disable_constraints option will disable foreign key constraints, by assuming that the source data is valid and referentially sound
EXEC sp_generate_inserts titles, @disable_constraints = 1
Example 14: To avoid scripting data from computed columns:
EXEC sp_generate_inserts MyTable, @ommit_computed_cols = 1

*/

[T-SQL] Script for Split Function

CREATE FUNCTION dbo.Split(@String varchar(8000), @Delimiter char(1))
returns @temptable TABLE (items varchar(8000))
as
begin
declare @idx int
declare @slice varchar(8000)

select @idx = 1
if len(@String)<1 or @String is null return

while @idx!= 0
begin
set @idx = charindex(@Delimiter,@String)

if @idx!=0
set @slice = left(@String,@idx - 1)
else
set @slice = @String

if(len(@slice)>0)
insert into @temptable(Items) values(@slice)

set @String = right(@String,len(@String) - @idx)
if len(@String) = 0 break
end
return
end
--how to use
select top 10 * from dbo.split('Karachi , Lahore , Isb',',')

Monday, August 16, 2010

[T-SQL] Shrink log in all your user databases

DECLARE @SQL NVARCHAR(MAX)

SELECT @SQL = COALESCE(@SQL + CHAR(13) + CHAR(10),'') + N'
Use ' + QUOTENAME(d.[name]) + ';' + CHAR(13) + '
ALTER DATABASE ' + QUOTENAME(d.[name]) + ' SET RECOVERY SIMPLE;
DBCC SHRINKFILE (' + QUOTENAME(mf.[name],'''') + ', 1);
ALTER DATABASE ' + QUOTENAME(d.[name]) + ' SET RECOVERY FULL;'
FROM sys.databases d
INNER JOIN sys.master_files mf ON [d].[database_id] = [mf].[database_id]
WHERE
d.[database_id] > 4 --no sys dbs
AND d.recovery_model = 1
AND d.is_read_only = 0
AND mf.[type] = 1 --log files
ORDER BY d.name

--print @SQL

EXECUTE (@SQL)

[T-SQL] How to search for all words inclusive without using Full Text search

DECLARE @MyTable TABLE (Id INT IDENTITY(1,1), Searched VARCHAR(200))
DECLARE @Keys TABLE (Word VARCHAR(200), WordId INT IDENTITY(1,1))

INSERT INTO @MyTable VALUES ('Mother Father Daughter Son')
INSERT INTO @MyTable VALUES ('Mother Daughter Son')
INSERT INTO @MyTable VALUES ('Mother Son')
INSERT INTO @MyTable VALUES ('Daughter Son')
INSERT INTO @MyTable VALUES ('Mother Father Son')
INSERT INTO @MyTable VALUES ('Son Daughter Father')
INSERT INTO @MyTable VALUES ('Mother Son')
INSERT INTO @MyTable VALUES ('Other Word')
INSERT INTO @MyTable VALUES ('Mother Father Daughter Brother Son')
INSERT INTO @MyTable VALUES ('Mother Daughter Son Stepdaughter')
INSERT INTO @MyTable VALUES ('Mother Son And Stepson and Daughter and Father and Grandfather')
INSERT INTO @MyTable VALUES ('Daughter Son Family')
INSERT INTO @MyTable VALUES ('Mother Brother Father Son Orphan')
INSERT INTO @MyTable VALUES ('Son or Daughter or Father')
INSERT INTO @MyTable VALUES ('Mother And Son')
INSERT INTO @MyTable VALUES ('Other Word One More')

INSERT INTO @Keys VALUES ('Mother')
INSERT INTO @Keys VALUES ('Father')
INSERT INTO @Keys VALUES ('Son')
INSERT INTO @Keys VALUES ('Daughter')

DECLARE @nAllWords INT
SELECT @nAllWords = COUNT(*) FROM @Keys

SELECT MyTable.*
FROM @MyTable MyTable
INNER JOIN (SELECT MyTable.Id
FROM @MyTable MyTable
INNER JOIN @Keys KeyWords ON ' ' + MyTable.Searched + ' ' LIKE '% ' + KeyWords.Word + ' %'
GROUP BY MyTable.Id
HAVING COUNT(DISTINCT(KeyWords.Word)) = @nAllWords) Tbl1 ON MyTable.Id = Tbl1.Id

Wednesday, August 11, 2010

[T-SQL] How to see sql server configurations

SELECT *
FROM sys.configurations
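
For example, to check one specific setting (a minimal sketch using standard sys.configurations columns):

SELECT name, value, value_in_use, description
FROM sys.configurations
WHERE name = 'max degree of parallelism'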

[T-SQL] Introduction to BINARY_CHECKSUM and Working Example

The following example is modified from the original example in Books Online (BOL). Note that BINARY_CHECKSUM is case-sensitive, so the checksum changes after the value is updated to 'TEST' and returns to its original value once the column is set back to 'test'.

USE AdventureWorks;
GO
-- Create table
IF EXISTS (SELECT * FROM sys.objects WHERE OBJECT_ID = OBJECT_ID(N'[dbo].[myTable]') AND TYPE IN (N'U'))
DROP TABLE [dbo].[myTable]
GO
CREATE TABLE myTable (column1 INT, column2 VARCHAR(256));
GO
-- Insert Value
INSERT INTO myTable VALUES (1, 'test');
GO
-- Check the Checksum
SELECT BINARY_CHECKSUM(*) AS BCH FROM myTable;
GO
-- Change the value
UPDATE myTable SET column2 = 'TEST';
GO
-- Check the Checksum
SELECT BINARY_CHECKSUM(*) AS BCH FROM myTable;
GO
-- Change the value
UPDATE myTable SET column2 = 'test';
GO
-- Check the Checksum
SELECT BINARY_CHECKSUM(*) AS BCH FROM myTable;
GO
-- Clean up
DROP TABLE [dbo].[myTable]
GO

Thursday, August 5, 2010

[T-SQL] Difference Between GRANT and WITH GRANT

Let us first see the syntax and analyze.

GRANT:
USE master;
GRANT VIEW ANY DATABASE TO username;
GO

WITH GRANT:
USE master;
GRANT VIEW ANY DATABASE TO username WITH GRANT OPTION;
GO


The difference between these options is very simple. With plain GRANT, the username cannot grant the same permission to other users. With the WITH GRANT OPTION clause, the username can in turn grant that permission to other users.
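
A minimal sketch to illustrate the difference; user1 and user2 are hypothetical logins, and user1 is assumed to have received the permission WITH GRANT OPTION:

USE master;
EXECUTE AS LOGIN = 'user1';
-- Succeeds only because user1 received VIEW ANY DATABASE WITH GRANT OPTION
GRANT VIEW ANY DATABASE TO user2;
REVERT;
GO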

[T-SQL] Script to Take Database Offline – Take Database Online

-- Create Test DB
CREATE DATABASE [myDB]
GO
-- Take the Database Offline
ALTER DATABASE [myDB] SET OFFLINE WITH
ROLLBACK IMMEDIATE
GO
-- Take the Database Online
ALTER DATABASE [myDB] SET ONLINE
GO
-- Clean up
DROP DATABASE [myDB]
GO

[T-SQL] Attach mdf file without ldf file in Database

USE [master]
GO
-- Method 1: I use this method
EXEC sp_attach_single_file_db @dbname='TestDb',
@physname=N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\TestDb.mdf'
GO
-- Method 2:
CREATE DATABASE TestDb ON
(FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA\TestDb.mdf')
FOR ATTACH_REBUILD_LOG
GO

[T-SQL] GUID vs INT

INT
Advantage:

1.Numeric values (and specifically integers) are better for performance when used in joins, indexes and conditions.
2.Numeric values are easier to understand for application users if they are displayed.
Disadvantage:

1.If your table is very large, it is quite possible it will run out of identity values: once the maximum of the numeric type is reached, there is no additional value left to use.
GUID
Advantage:

1.Unique across servers (globally unique), which is useful when data from multiple sources must be merged.
Disadvantage:

1.GUID values are not as optimal as integer values for performance when used in joins, indexes and conditions.
2.More storage space is required than INT (16 bytes versus 4 bytes).
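
A minimal sketch of both approaches (table and column names are illustrative only):

-- INT identity key: compact (4 bytes) and fast in joins, but bounded by the INT range
CREATE TABLE OrdersInt
(
OrderID INT IDENTITY(1,1) PRIMARY KEY,
OrderDate DATETIME
)
GO
-- GUID key: 16 bytes, unique across servers; NEWSEQUENTIALID() causes less index fragmentation than NEWID()
CREATE TABLE OrdersGuid
(
OrderID UNIQUEIDENTIFIER DEFAULT NEWSEQUENTIALID() PRIMARY KEY,
OrderDate DATETIME
)
GO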

[T-SQL] Disable Clustered Index and Data Insert

Let us go over this example, which shows that inserting data is not possible while the clustered index is disabled.

USE AdventureWorks
GO
-- Create Table
CREATE TABLE [dbo].[TableName](
[ID] [int] NOT NULL,
[FirstCol] [varchar](50) NULL,
CONSTRAINT [PK_TableName] PRIMARY KEY CLUSTERED
([ID] ASC)
)
GO
-- Create Nonclustered Index
CREATE UNIQUE NONCLUSTERED INDEX [IX_NonClustered_TableName] ON [dbo].[TableName]
([FirstCol] ASC)
GO
-- Populate Table
INSERT INTO [dbo].[TableName]
SELECT 1, 'First'
UNION ALL
SELECT 2, 'Second'
UNION ALL
SELECT 3, 'Third'
GO
-- Disable Nonclustered Index
ALTER INDEX [IX_NonClustered_TableName] ON [dbo].[TableName] DISABLE
GO
-- Insert Data should work fine
INSERT INTO [dbo].[TableName]
SELECT 4, 'Fourth'
UNION ALL
SELECT 5, 'Fifth'
GO
-- Disable Clustered Index
ALTER INDEX [PK_TableName] ON [dbo].[TableName] DISABLE
GO
-- Insert Data will fail
INSERT INTO [dbo].[TableName]
SELECT 6, 'Sixth'
UNION ALL
SELECT 7, 'Seventh'
GO
/*
Error: Msg 8655, Level 16, State 1, Line 1
The query processor is unable to produce a plan because the index 'PK_TableName' on table or view 'TableName' is disabled.
*/
-- Reorganizing Index will also throw an error
ALTER INDEX [PK_TableName] ON [dbo].[TableName] REORGANIZE
GO
/*
Error: Msg 1973, Level 16, State 1, Line 1
Cannot perform the specified operation on disabled index 'PK_TableName' on table 'dbo.TableName'.
*/
-- Rebuilding should work fine
ALTER INDEX [PK_TableName] ON [dbo].[TableName] REBUILD
GO
-- Insert Data should work fine
INSERT INTO [dbo].[TableName]
SELECT 6, 'Sixth'
UNION ALL
SELECT 7, 'Seventh'
GO
-- Clean Up
DROP TABLE [dbo].[TableName]
GO

[T-SQL] DATE and TIME in SQL Server 2008

Example of Date and Time

DECLARE @varDate AS DATE
DECLARE @varTime AS TIME
SET @varDate = '10/10/2010'
SET @varTime = '12:12:12'
SELECT CAST(@varDate AS DATETIME) C_Date
SELECT CAST(@varTime AS DATETIME) C_Time

Result would be

2010-10-10 00:00:00
1900-01-01 12:12:12

[T-SQL] Merge Operator

Let us run the same example I have used earlier; I am listing it here again for convenience.

--Let’s create Student Details and StudentTotalMarks and inserted some records.
USE tempdb
GO
CREATE TABLE StudentDetails
(
StudentID INTEGER PRIMARY KEY,
StudentName VARCHAR(15)
)
GO
INSERT INTO StudentDetails
VALUES(1,'SMITH')
INSERT INTO StudentDetails
VALUES(2,'ALLEN')
INSERT INTO StudentDetails
VALUES(3,'JONES')
INSERT INTO StudentDetails
VALUES(4,'MARTIN')
INSERT INTO StudentDetails
VALUES(5,'JAMES')
GO
CREATE TABLE StudentTotalMarks
(
StudentID INTEGER REFERENCES StudentDetails,
StudentMarks INTEGER
)
GO
INSERT INTO StudentTotalMarks
VALUES(1,230)
INSERT INTO StudentTotalMarks
VALUES(2,255)
INSERT INTO StudentTotalMarks
VALUES(3,200)
GO
-- Select from Table
SELECT *
FROM StudentDetails
GO
SELECT *
FROM StudentTotalMarks
GO
-- Merge Statement
MERGE StudentTotalMarks AS stm
USING (SELECT StudentID,StudentName FROM StudentDetails) AS sd
ON stm.StudentID = sd.StudentID
WHEN MATCHED AND stm.StudentMarks > 250 THEN DELETE
WHEN MATCHED THEN UPDATE SET stm.StudentMarks = stm.StudentMarks + 25
WHEN NOT MATCHED THEN
INSERT(StudentID,StudentMarks)
VALUES(sd.StudentID,25);
GO
-- Select from Table
SELECT *
FROM StudentDetails
GO
SELECT *
FROM StudentTotalMarks
GO
-- Clean up
DROP TABLE StudentTotalMarks -- drop the referencing table first (foreign key on StudentID)
GO
DROP TABLE StudentDetails
GO

[T-SQL] Precision of SMALLDATETIME – A 1 Minute Precision


I am surprised to be writing this post, as it covers one of the well-known facts about the SQL Server SMALLDATETIME datatype. Even though it is a very familiar datatype, I have often seen developers getting confused about its precision.

The precision of the SMALLDATETIME datatype is 1 minute. Seconds are not stored; the value is rounded to the nearest minute (roughly, 29 seconds or less rounds down and 30 seconds or more rounds up). Let us see the following example

DECLARE @varSDate AS SMALLDATETIME
SET @varSDate = '1900-01-01 12:12:01'
SELECT @varSDate C_SDT
SET @varSDate = '1900-01-01 12:12:29'
SELECT @varSDate C_SDT
SET @varSDate = '1900-01-01 12:12:30'
SELECT @varSDate C_SDT
SET @varSDate = '1900-01-01 12:12:59'
SELECT @varSDate C_SDT
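
Based on the 1-minute rounding behaviour described above, the four SELECTs should return:

1900-01-01 12:12:00
1900-01-01 12:12:00
1900-01-01 12:13:00
1900-01-01 12:13:00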

[T-SQL] Difference Between GETDATE and SYSDATETIME

Sometimes something simple skips our mind. I never knew the difference between GETDATE and SYSDATETIME until I ran the following simple query and realized the difference.

SELECT GETDATE() fn_GetDate, SYSDATETIME() fn_SysDateTime

In the case of GETDATE the result is a datetime value with millisecond precision, while in the case of SYSDATETIME the result is a datetime2(7) value with 100-nanosecond precision.
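
A small sketch that makes the difference in the underlying types visible (the actual date values will vary on each run):

-- GETDATE() returns datetime (3 fractional digits), SYSDATETIME() returns datetime2(7)
SELECT SQL_VARIANT_PROPERTY(GETDATE(), 'BaseType') AS fn_GetDate_Type,
SQL_VARIANT_PROPERTY(SYSDATETIME(), 'BaseType') AS fn_SysDateTime_Type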

[T-SQL] Self Join – Inner Join and Outer Join

USE TempDb
GO
-- Create a Table
CREATE TABLE Employee(
EmployeeID INT PRIMARY KEY,
Name NVARCHAR(50),
ManagerID INT
)
GO
-- Insert Sample Data
INSERT INTO Employee
SELECT 1, 'Mike', 3
UNION ALL
SELECT 2, 'David', 3
UNION ALL
SELECT 3, 'Roger', NULL
UNION ALL
SELECT 4, 'Marry',2
UNION ALL
SELECT 5, 'Joseph',2
UNION ALL
SELECT 7, 'Ben',2
GO
-- Check the data
SELECT *
FROM Employee
GO



We will now use inner join to find the employees and their managers’ details.

-- Inner Join
SELECT e1.Name EmployeeName, e2.name AS ManagerName
FROM Employee e1
INNER JOIN Employee e2
ON e1.ManagerID = e2.EmployeeID
GO



From the result set, we can see all the employees who have a manager. However, the top manager of the company is missing from the result set: the inner join filters out his row, because an inner join returns no row for an employee whose ManagerID has no match (here it is NULL). Let us convert the inner join to an outer join and look at the result set again.

-- Outer Join
SELECT e1.Name EmployeeName, ISNULL(e2.name, 'Top Manager') AS ManagerName
FROM Employee e1
LEFT JOIN Employee e2
ON e1.ManagerID = e2.EmployeeID
GO



Once we convert the inner join to an outer join, we can see the top manager as well. Here we have seen how a self join can behave as either an inner join or an outer join.
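
The other examples in this post end with a clean-up step; an equivalent (optional) one for this demo would be:

-- Clean up
DROP TABLE Employee
GO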