Why Functions in WHERE clauses should be avoided

Consider the following query:

SELECT ModifiedDate FROM Contacts 
WHERE DATEDIFF(MINUTE, ModifiedDate, GETDATE()) > 0

This query uses the DATEDIFF function in the WHERE clause, and when we look at the query plan we can see an index scan is being performed.

[Query plan screenshot showing an index scan]

The issue with an index scan is that every row in the index is read before a result is returned.

And now consider this query:

SELECT ModifiedDate FROM Contacts 
WHERE GETDATE() > ModifiedDate

Since we are not applying a function to the column in the WHERE clause, this query uses an index seek.

[Query plan screenshot showing an index seek]

An index seek is much more efficient than an index scan when there is an index on the column.
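The same principle applies to other functions on the filtered column. As a rough sketch (assuming an index exists on ModifiedDate), a predicate such as YEAR(ModifiedDate) = 2017 can be rewritten as a range so the optimizer is able to seek:

-- Non-sargable: the function must be evaluated against every row, forcing a scan
SELECT ModifiedDate FROM Contacts
WHERE YEAR(ModifiedDate) = 2017

-- Sargable rewrite: the bare column allows an index seek on the same index
SELECT ModifiedDate FROM Contacts
WHERE ModifiedDate >= '20170101' AND ModifiedDate < '20180101'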

Useful SQL Snippets

Returns first day of a week (Most recent Monday)

SELECT DATEADD(wk, DATEDIFF(wk,0,GETDATE()), 0)

Returns Date of Previous Sunday

SELECT DATEADD(wk, DATEDIFF(wk,0,GETDATE()), -1)

To get midnight of a timestamp

DECLARE @dtTimestamp DATETIME = '2017-10-12 15:26:00.000'
SET @dtTimestamp = DATEADD(DAY, DATEDIFF(DAY, 0, @dtTimestamp), 0) -- strips the time portion
PRINT @dtTimestamp -- Oct 12 2017 12:00AM

To concatenate an INT and a VARCHAR

SET @ConcatString = 'Name' + CAST(@ID AS VARCHAR(10))
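A minimal runnable sketch of the same idea (the variable names and values here are just for illustration):

DECLARE @ID INT = 42
DECLARE @ConcatString VARCHAR(50)
SET @ConcatString = 'Name' + CAST(@ID AS VARCHAR(10))
PRINT @ConcatString -- prints Name42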

Clearing Caches in SQL Server

The following are a few useful commands for database performance testing; a typical usage sequence is sketched after the list:

  • DBCC FREEPROCCACHE – Clears the procedure cache and forces SQL Server to recompile procedure plans the next time each procedure is run
  • DBCC DROPCLEANBUFFERS – Clears the SQL Server data cache, i.e. any data that exists in the buffer cache from previous query executions is removed
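
A typical cold-cache test run might look like the following (use this only on a test server, since clearing these caches affects every query on the instance; the query under test is just an example):

CHECKPOINT              -- write dirty pages to disk so DROPCLEANBUFFERS can remove them
DBCC DROPCLEANBUFFERS   -- clear the data cache
DBCC FREEPROCCACHE      -- clear the plan cache

SET STATISTICS IO ON
SET STATISTICS TIME ON

SELECT ModifiedDate FROM Contacts WHERE GETDATE() > ModifiedDate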

$q – Promises in JavaScript

$q is a service in AngularJS that helps a developer run functions asynchronously and use their promised return values once the asynchronous work has finished processing.

A promise is a special type of object that we can either use, or construct ourselves, to handle asynchronous tasks. We call them promises because we are “promised” a result at a future point in time. For example, an HTTP call could complete in 200ms or 400ms; the promise's callbacks execute whenever it is resolved.

A promise is said to be fulfilled when we get a result from a particular action. Conversely, a promise is said to be rejected when we do not receive a response

Deferred API

A new instance of deferred is constructed by calling $q.defer()

A deferred object can either be fulfilled using .resolve(data) or rejected using .reject(error)

Promise API

A new promise instance is created when a deferred instance is created and can be retrieved by calling deferred.promise. The purpose of the promise object is to allow for interested parties to get access to the result of the deferred task when it completes.


A call to a function that returns a promise should be chained with a .then() that receives a callback to execute when the promise is resolved.

The function that returns the promise should call off to the service/endpoint it is connecting to in order to get data, and have its own .then() containing a callback to execute when the actual data is returned.
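
As a sketch of both sides (the module name, service name, and URL here are made up for illustration):

angular.module('app').factory('customerService', function ($http, $q) {
    return {
        getCustomers: function () {
            var deferred = $q.defer();

            // Hook off to the endpoint, then resolve or reject the deferred when the HTTP call finishes
            $http.get('/api/customers').then(function (response) {
                deferred.resolve(response.data);
            }, function (error) {
                deferred.reject(error);
            });

            return deferred.promise;
        }
    };
});

// e.g. inside a controller where customerService has been injected:
// the .then() callbacks run when the promise is fulfilled or rejected
customerService.getCustomers().then(function (customers) {
    console.log('Got ' + customers.length + ' customers');
}, function (error) {
    console.log('Request failed', error);
});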

http://andyshora.com/promises-angularjs-explained-as-cartoon.html

https://toddmotto.com/promises-angular-q

http://haroldrv.com/2015/02/understanding-angularjs-q-service-and-promises/

https://docs.angularjs.org/api/ng/service/$q

Creating and Applying Git Patches

To export a Git patch, use

git format-patch -1 <sha>

where <sha> is the commit SHA, or use

git format-patch -1 HEAD

if the last commit is the commit you want to export


To view the changes contained in a Git patch, use

git apply --stat patchName.patch

To test the patch before you actually apply it, use

git apply --check patchName.patch

To apply a patch, use

git am patchName.patch

If applying a patch fails, use the following to abort

git am --abort
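
Putting it together, a typical exchange between two machines might look like this (the patch file name is just a placeholder; git format-patch names the file after the commit subject):

# On the source repository: export the last commit
git format-patch -1 HEAD

# On the target repository: inspect, dry-run, then apply
git apply --stat 0001-example-change.patch
git apply --check 0001-example-change.patch
git am 0001-example-change.patch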

Bulk Inserting Data into a Database

Bulk inserting data into a database table can be a very useful operation for the following reasons:

  • You may have data stored in a file that you need to insert all at once
  • You may also want a table TRIGGER to execute only once; if a cursor inserted the data row by row, the trigger would fire on each insert, which would make the operation much less performant. A bulk insert ensures the trigger executes only once, regardless of the amount of data inserted.


Methods of Bulk Insertion

1. Bulk Inserting Data from a file

Consider the scenario where we have customer details stored in a csv file and want to insert these records into a customer table in our database.

BULK INSERT Customers FROM 'C:\Customers.csv'
 WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
 GO

In relation to executing triggers, the BULK INSERT command actually has a mechanism for firing the destination table's INSERT triggers if required (they fire once for the batch, as if it were a single multi-row insert). If we wanted triggers to fire for the rows inserted from the Customers.csv file, our BULK INSERT statement would look like the following:

BULK INSERT Customers FROM 'C:\Customers.csv'
 WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRE_TRIGGERS)
 GO

Omitting FIRE_TRIGGERS means that no triggers will be fired.


2. Bulk Inserting Data using a temporary table

This uses the same Customers example as above, except this time we insert data row by row into a temp table, and then insert the entire contents of that temp table into the Customers table in a single statement.

The main requirement here is that @Customers and @CustomersTemp must have the same definition:

DECLARE @Customers AS TABLE 
	([CustomerID] [int] NOT NULL, [Age] [int] NULL, [Data] [nvarchar](max) NULL)

DECLARE @CustomersTemp AS TABLE 
	([CustomerID] [int] NOT NULL, [Age] [int] NULL, [Data] [nvarchar](max) NULL)

For example, if we had a script that created Customers in bulk, we could have a cursor that inserts each record into the @CustomersTemp temporary table (a sketch of this row-by-row step is shown after the insert below). Once all the data has been inserted into this temp table, we insert the contents of the temp table into the actual table as follows:

INSERT INTO @Customers (CustomerID, Age, Data)
SELECT CustomerID, Age, Data FROM @CustomersTemp
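
As a sketch of the row-by-row step, which would run before the final INSERT above (here a simple WHILE loop stands in for the cursor, and the generated values are purely illustrative):

-- Row-by-row population of the temp table
DECLARE @i INT = 1
WHILE @i <= 1000
BEGIN
    INSERT INTO @CustomersTemp (CustomerID, Age, Data)
    VALUES (@i, 20 + (@i % 50), N'Generated customer ' + CAST(@i AS NVARCHAR(20)))
    SET @i = @i + 1
END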