Where in the SQL World is Mickey Stuewe

I’m in countdown mode right now. Countdown to Halloween. Not because I get to dress up this Friday, but because I’m leaving for my favorite time of year: PASS Summit. This will be my first year speaking at Summit, but I’m getting ahead of myself. Let’s start at the beginning.

One of the most amazing things about Summit is the networking opportunities. These opportunities have the potential to start lifelong friendships. And that is where my post really needs to start, with my friend Julie Koesmarno. Julie and I were at most of the same events at my first Summit, but we didn’t get to know each other until after the Summit. Now she has flown in from Australia to spend a week at my place before we head out to Summit together. Thursday, two more Australian friends will be joining us here in Orange County before we fly up to Oregon together on Friday for SQL Saturday in Portland, where I’ll be presenting my Changing Your Habits to Improve the Performance of Your T-SQL session and taking as many classes as I can.

Sunday we’ll be taking a leisurely train ride up to Seattle. It will be the last day of relaxation before the madness of Summit-related activities starts.

Monday I will be spending the day hanging out with my favorite tool company, Red Gate. They are putting on a free one-day conference called SQL in the City. This is an amazing day to network, eat great food, see amazing presentations, and end the day English style with a pint of beer. You heard me correctly. Red Gate is a British company, and they always end SQL in the City with a pint of beer. The beer bottles have customized labels. Last year’s said “Query Hoptimizer”.

This year I have two presentations at SQL in the City. My first presentation is called Customize your faux test data with SQL Data Generator and focuses on how to generate test data. My second presentation is a lightning talk called Finding the delta with SQL Compare and backups.

Monday night I will be attending Andy Warren and Steve Jones’s networking party. (I hope to see you all there.) This is a great way to start off on the right foot at the Summit. You get to sit down to share a meal and network. I was introduced to many new people the first time I went, and yes, I remember quite a few of them. Tuesday is all about me. OK, almost all about me. I signed up for an all-day precon on the topic of SSIS. I do have to step out in the afternoon to attend a chapter leaders’ meeting though.

Then the evening hits with Summit’s opening night event, followed closely by several other events that I will be attending. (Sleeping has been rescheduled for later in November.)

On Wednesday, I’ve been invited to be at the bloggers table for the Day 1 Keynote. I’m very excited about this. So, if you are unable to be at the Summit this year and you want to know what is going on during the first Keynote presentation Wednesday morning at 8:15 am, then come hang out on my blog. It will automatically update until 9:45 am. Stay tuned to my Twitter feed as well; I might “practice” by live blogging one of the sessions on Saturday or Monday too.

Thursday is my big day. I will be presenting for the fourth time in a week, but it will be my very first time speaking at the PASS Summit. My presentation is entitled Techniques for Dynamic SSRS Reports. The presentation will be held in room 2AB at 10:45 am. Since it is Thursday, I’ll most likely be presenting in my kilt. Many attendees wear kilts on Thursday to celebrate Women In Technology (WIT). After I’m done with my presentation, I’ll be running over to the WIT lunch. The event has sold out, so I hope I can find a seat. Kimberly Bryant, founder of Black Girls CODE, will be giving a keynote at the event. Thursday afternoon from 3:30 to 4:30 you’ll find me hanging out at the community center. (I hope you come by and say hi.) I’ll be there to network and answer questions about community, SQL, SSRS, and whatever else comes up. (All questions I can’t answer will be forwarded to Grant Fritchey, since he has nothing to do until January when his first term on the PASS board starts.)

Thursday night I hope to celebrate by singing SQLKaraoke style at Busch Gardens, but you never know where I’ll end up. The evenings at the Summit are like being caught in a strong current. You never know where the evening will take you. The point is to enjoy the ride and meet new people.

Friday I’ll be getting my final learning on and attending the Birds of a Feather lunch. This is an event where you can sit with others to talk about specific SQL-related topics. So, bring your questions and share a meal with like-minded individuals.

Saturday I’ll be heading back to Orange County to give hugs to my family and my puppy before I crash and catch up on all my sleep.

T-SQL Tuesday #59 – My Hero is Scary

Tracy McKibben (b|t) is hosting this month’s T-SQL Tuesday blog party. The party was started by Adam Machanic (b|t) in December of 2009. This month’s invitation is about who our hero is.

I didn’t like him at first

That’s right. I was not happy with my hero when I first met him, but maybe I should start from the beginning. It was a dark and stormy fall day… what am I saying? I live in Southern California. It was a beautiful day in Los Angeles, and Red Gate was hosting their first American SQL in the City. The day was Oct 28th, 2011. I had been a Red Gate addict for a while, and I was very excited about attending the event. (It’s free after all.)

I listened to presentations all day from speakers I didn’t know (yet). You see, I was not involved in the SQL Community yet. (That would take another year to happen.) The day was winding down and there was a presentation by this guy named Grant Fritchey (b|t). He was speaking about performance tuning, execution plans, and other things I didn’t understand at that time. Then he brought up User Defined Functions, which I was in love with. And he told me how bad they were, and my jaw dropped. I was not happy that he just told me my favorite toy was not good for the performance of my queries.

Even though he had just delivered me some bad news, he caught my attention. You see, Grant is an amazing speaker. He gets excited about what most of the human race considers boring. He engages the audience. And he loves questions.

Fast forward a few months

While I wasn’t happy with what Grant told me, he piqued my interest. I found a book of Grant’s called SQL Server Execution Plans. I read that book cover to cover, and then I started using what I learned. Within a few months, I was the go-to person at work for query tuning. I was able to tune a 12-minute query down to acceptable performance because of what I had learned from Grant’s book, and I was rewarded for my great work. You see, I had learned about PASS Summit by then, and Grant had a pre-con… and I wanted to be in that class. Since I had done an outstanding job that summer improving the performance of quite a few lagging queries, my manager let me go to my first PASS Summit, and it changed my career.

Fan girl

One of the goals I had for Summit was to meet a list of people I had created. Some were speakers and some were people I had gotten to know on Twitter. Grant, of course, was at the top of the list… and I had met him within an hour of my plane landing in Seattle. You see, Red Gate had another SQL in the City event, this time on the Monday of the week of Summit. (They are doing it this year too.) I wasn’t able to attend the entire event, but I did make it to Grant’s session.

I won’t bore you with the details of that whole week, but needless to say, any time I had a question and Grant was free, I was able to talk to him. Why? Because that is the kind of guy he is. He loves helping people.

Not so scary after all

While it sounds like I just have a big “geek crush” on the guy, that isn’t the case… any more. Here is a short list of reasons why I call Grant Fritchey my hero:

  • Fabulous Presenter: Grant is a fantastic presenter. He is engaging, he loves questions, and he has no problem admitting that he doesn’t know everything.
  • Great author: Let’s admit it. Technical books make better monitor stands than interesting reading. Technical books can be so monotonous, but not Grant’s books. You can hear Grant in his books. His sense of humor comes out between the lines.
  • Giver at heart: As mentioned earlier, he loves to help people. You can send him a question through his blog, and he’ll reply. He is constantly speaking at events and conferences, sharing his knowledge in the SQL Family style.
  • He gives it his all: Grant doesn’t do anything halfway. He is prepared when he speaks. He is prepared when he runs for office. And he is prepared when he agrees to be a mentor.
  • My mentor: At the beginning of 2013, I wanted to have a mentor. I even blogged about it here. I got the courage up and I sent Grant an email, asking if he would mentor me… and he said yes. I was shocked and honored to find out that I was the first person he had “officially” mentored. He mentored me all throughout 2013.
  • Frank: Grant is not scary, but he doesn’t sugar coat his answers. I love this about him. It means that I can trust his opinions, whether they are about SQL or about my abilities, and especially about the things I still need to work on.
Shout out to Grant

Thanks Grant, for lifting me up and telling me that I can accomplish things I thought were outside my reach. Thanks for being a fabulous role model as a speaker and community leader. Thanks for being my hero and my friend.

Thanks for all the fish

Thanks also go out to Tracy McKibben for hosting this month’s T-SQL Tuesday blog party.

Questions Answered From Presentation: Changing Your Habits to Improve the Performance of Your T-SQL

This past Tuesday, Oct 8th, 2014, I had the privilege of speaking for the DBA Fundamentals Virtual PASS Chapter. It was my biggest audience to date and the fourth biggest audience for the user group to date. There were 374 people listening in, and they weren’t all from the US. I was thrilled to see that I had at least one person listening in from the UK.

Since there were so many people online, I couldn’t answer all the questions… thus my post today. You can download the slide deck, demos, and my sample database. (All my presentations eventually find their way to my resources page, here.)

I will post the link to the presentation on this page when it becomes available.

Questions Answered

1.    What is the name of the Red Gate tool for source control you mentioned?

Red Gate’s tool is called SQL Source Control. It’s like a bridge between your source control repository and SSMS or Visual Studio. They have a trial version that you can download here.

2.    What to do when you come into a team and there was no SQL coding standards before and we do not want to spend time re-writing everything?

I would first get buy-in on why standards are important to have and outline the standards you want to use. Then I would slowly change the code. What I mean by this is: update the code as you make changes to it. For instance, if you need to modify stored procedure XYZ, add the standards only to that stored procedure, or even only to the part of the stored procedure that you change.

3.    What version of SSMS are you using?

In the presentation I was using SQL 2012, but all the concepts I covered apply to SQL 2005, 2008, and 2008 R2. Even the Template Browser and Template Parameters I covered are available in those versions. (Talk about a well-kept secret!) Note: If you need the sample database in SQL 2008 or SQL 2008 R2, please let me know and I’ll see what I can do.

4.    Can you give an example of commenting within a stored procedure?

Sure. You can comment in two different ways. You can comment at the end of a line using two dashes, or you can comment anywhere in the code using /* and */ to enclose the comment, which allows a comment to span multiple lines.
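
For instance, here is a quick sketch showing both styles:

-- This is a single-line comment using two dashes.
SELECT GETDATE() AS CurrentDateTime; -- It can also follow code on the same line.

/*
This is a block comment using slash-asterisk.
It can span multiple lines.
*/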

When I’m adding comments about my changes, I will include the date, my name, and a brief comment. If I’m lucky enough to have a work item or bug tracker application to keep track of my work, then I’ll reference that number as well. Here is an example.


USE AWMonkey
GO
/* ***********************************************************************************
Purpose:    Returns list of Territory Groups.
Notes:
Sample:
            EXEC List.GetTerritory
Author:     THEZOO\mstuewe
Date:       9/16/2014

Revision History
(Change Date)    (Author)         (Description of Change)
-----------------------------------------------------------------
10/10/2014       Mickey Stuewe    #2345 Added All to the list of values as requested
                                  by end user.
************************************************************************************ */
ALTER PROCEDURE List.GetTerritory
AS
BEGIN TRY

     SET NOCOUNT ON;

     SELECT
          st.TerritoryID
          ,[Group] AS CountryRegion
          ,st.Name AS Territory
     FROM
          AdventureWorks2008R2.Sales.SalesTerritory AS st
     UNION ALL
     SELECT
          -1
          ,' All'
          ,' All'
     ORDER BY
          CountryRegion; --Added the ALL to the list.
END TRY
BEGIN CATCH

     DECLARE
          @ErrorMessage AS nvarchar(3000)
          ,@ErrorSeverity AS int;

     SET @ErrorMessage = ISNULL(DB_NAME(DB_ID()) + N'.' + SCHEMA_NAME(SCHEMA_ID()) + N'.'
          + OBJECT_NAME(@@PROCID, DB_ID()), N'SQL Object Name Not Available')
          + N': Error: ' + CONVERT(nvarchar(10), ERROR_NUMBER())
          + N' Line: ' + CONVERT(nvarchar(5), ERROR_LINE())
          + N' - ' + ERROR_MESSAGE();

     SET @ErrorSeverity = ERROR_SEVERITY();
     RAISERROR(@ErrorMessage, @ErrorSeverity, 1);
END CATCH;

SET NOCOUNT OFF;

5.    Is there any configuration for the Template Browser? Can it integrate with any source control?

No configuration for the Template Browser is needed. Here are the generic steps to integrate your Template Browser folder with your source control repository. Each Source Control repository will be different, but these are the general steps.

1. Open the location of the Template Browser. Since I’m using SQL 2012, I found mine here:

C:\Users\<Your User Name>\AppData\Roaming\Microsoft\SQL Server Management Studio\11.0\Templates\Sql\<My Templates>

2. Import your templates into your source control repository.

3. Set your templates folder as the “working folder” location.

4. Test by getting the latest from your repository.

5. Connect the template browsers on your team’s computers the same way.

6. Let them know when you make changes to the templates so that they can do a “get latest” to download the changes to their computers.

6.    Is there any kind of Freeware repository tool?

Yes, Subversion is an open source product and a widely used repository tool. It has two parts. The first part is the repository, which resides in a common location, usually on the network so that everyone can use it. Then each user needs a Subversion client tool to connect to the repository. There are many different clients to pick from. I use Tortoise. Why? It was suggested to me and recommended by Red Gate. Since I was going to use SQL Source Control by Red Gate to connect to Subversion, I decided to stick with their recommendation.

7.    So, Row_Number() is non-deterministic?

Yes, that is correct.
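
As a quick illustration (using a hypothetical StudentScore table): if the ORDER BY inside the OVER clause is not unique, tied rows can receive different row numbers from one execution to the next. Adding a unique tie-breaker column makes the numbering repeatable.

-- With ties on Score, the tied rows can swap row numbers between executions.
SELECT
     StudentID
     ,Score
     ,ROW_NUMBER() OVER (ORDER BY Score DESC) AS RowID
FROM
     dbo.StudentScore;

-- Adding the primary key as a tie-breaker makes the numbering deterministic.
SELECT
     StudentID
     ,Score
     ,ROW_NUMBER() OVER (ORDER BY Score DESC, StudentScoresID) AS RowID
FROM
     dbo.StudentScore;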

8.    Is there any difference or improvements in using a CAST vs CONVERT or vice versa?

There isn’t a difference between the two functions in terms of performance. As Jeff Joy kindly pointed out, the difference between the two comes down to compliance.

Jeff Joy – CAST is ANSI-compliant, while CONVERT is a Microsoft-specific casting function that builds in additional formatting functionality. If you may ever need to convert to another platform, you need to write code that is cross-platform compliant: use CAST.

The downside of writing all of your code to ANSI-compliant standards is that you miss out on quite a few rich features that Microsoft has added to the SQL language. These rich features can help SQL statements perform faster or make your life easier.
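
As a quick illustration of that formatting functionality, both statements below return the same date, but only CONVERT accepts a style code (101 produces mm/dd/yyyy):

-- ANSI-compliant:
SELECT CAST(GETDATE() AS varchar(30));

-- Microsoft-specific, with the optional style argument:
SELECT CONVERT(varchar(30), GETDATE(), 101);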

9.    Isn’t CTEs much better than subqueries? This is in regards to readability?

Yes, CTEs (Common Table Expressions) are much easier to read. There are even patterns that can be accomplished with CTEs that can’t be accomplished with subqueries, but CTEs aren’t always faster. I have had a few queries I’ve had to re-write with subqueries because I needed every millisecond of performance.
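
Here is a small sketch of the same logic written both ways (the table is hypothetical). The CTE version reads top to bottom, while the subquery version forces you to read inside out:

-- As a subquery (derived table):
SELECT
     hs.StudentID
     ,hs.HighScore
FROM
     (
          SELECT StudentID, MAX(Score) AS HighScore
          FROM dbo.StudentScore
          GROUP BY StudentID
     ) AS hs;

-- As a CTE:
WITH cte_HighScores
AS
(
     SELECT StudentID, MAX(Score) AS HighScore
     FROM dbo.StudentScore
     GROUP BY StudentID
)
SELECT
     StudentID
     ,HighScore
FROM
     cte_HighScores;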

10.    Can I see the trace options selected?

Yes, here are the events I captured. I also filtered the trace down to just the demo database.

 

QuestionsAnswered20141010_Image1

11.    Is there a standards document about T-SQL Formatting?

I don’t have one I can give you. I can create a template for you if you would like. Just send me an email.

12.    Can you e-mail that URL to us?

I’m not sure which URLs this refers to, so I’ve listed the URLs that were mentioned in my presentation.

13.    Is there better performance between CTE and Apply?

Comparing CTEs and APPLY is like comparing apples and oranges. They are two different things. A CTE provides you the ability to create a query to be used within another query. APPLY is an operator like INNER JOIN, but it is specifically used to join table functions (inline and multi-line) as well as subqueries and CTEs.
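
As a minimal sketch (the table names are illustrative only), here is APPLY joining a correlated subquery to each outer row, something a plain INNER JOIN can’t express directly:

SELECT
     s.Name AS Store
     ,cc.CustomerCount
FROM
     Sales.Store AS s
     OUTER APPLY (
          SELECT COUNT(*) AS CustomerCount
          FROM Sales.Customer AS c
          WHERE c.StoreID = s.BusinessEntityID
     ) AS cc;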

14.    On the comparison of temporary, table variables, and materialized tables…Why was the second execution plan eligible for auto-parameterization and the others were not?

This will take a little bit of investigation since I don’t know off the top of my head. I will post this one in the next week or so.

15.    What is the proper way to use DISTINCT with lots of columns in the SELECT without duplicate records? (Per our SDLC, we avoid using DISTINCT with lots of columns.)

The DISTINCT operator causes a SORT in the execution plan, which is usually a very costly operator. Unfortunately, sometimes you can’t avoid it. One thing that you can do is see if there are other ways to filter the data so that there aren’t duplicate records. For example, maybe there is a column that marks the latest record as active. You could filter on that field to provide uniqueness across the rest of the fields, as in the sketch below. Another example would be joining to another table that will help reduce the records, but not alter the data you need to return.
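
Here is a minimal sketch of that first suggestion, assuming a hypothetical IsActive column that marks the latest record:

-- Instead of SELECT DISTINCT over many columns, filter to the one
-- active row per customer (IsActive and the table are hypothetical).
SELECT
     CustomerID
     ,FirstName
     ,LastName
     ,Address
FROM
     dbo.CustomerHistory
WHERE
     IsActive = 1;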

IMHO, your SDLC is trying to avoid unnecessary DISTINCT operators in the code. I had added that same requirement to my SDLC document when DISTINCT was being abused. I later added the comment, “If you can prove the need for the DISTINCT operator, then the use of DISTINCT will be approved.”

16.    About predicates, and especially LIKE: we know that an efficient predicate shouldn’t start with a wildcard, but if we need to return data that ends with a specific string, is there a way to improve that?

See # 17 below.

17.    REVERSE string?

I wanted to answer 16 and 17 together. #18 is related, but I answered it separately. While all the string functions are deterministic, the particular pattern that needs to be implemented (looking at the end of a string) does not lend itself to an efficient execution plan. This has to do with how indexes are built. The keys in the index are stored sorted. This means that finding records can only be efficient when comparing strings from left to right, not right to left. You can see this by running the following sample on the demo database I provided.

I created an index on the field we are looking at, then I tried various techniques to get the last three characters. All three execution plans show an Index Scan.


USE DemoProgramming
GO

IF EXISTS (SELECT * FROM sys.indexes WHERE object_id = OBJECT_ID(N'dbo.TestData') AND name = N'IX_TestData_SomeDateText')
     DROP INDEX IX_TestData_SomeDateText ON dbo.TestData WITH (ONLINE = OFF)
GO
CREATE NONCLUSTERED INDEX IX_TestData_SomeDateText ON dbo.TestData
(
     SomeDateText ASC
)

SELECT
     SomeDateText
FROM
     dbo.TestData AS td
WHERE
     td.SomeDateText LIKE '%-09'

SELECT
     SomeDateText
FROM
     dbo.TestData AS td
WHERE
     REVERSE(SomeDateText) LIKE '90-%'

SELECT
     SomeDateText
FROM
     dbo.TestData AS td
WHERE
     LEFT(REVERSE(SomeDateText), 3) = '90-'

When you run this sample, you do get an Index seek.


SELECT
     SomeDateText
FROM
     dbo.TestData AS td
WHERE
     SomeDateText LIKE '2004%'

With all that said, if a particular column is constantly being split into separate parts to be searched, then I would consider storing those separate parts in their own fields in the database. That way they can be indexed to improve performance.
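
One hedged way to do that for this example, without changing how the application inserts data, is a persisted computed column that stores the reversed string; the column and index names below are assumptions:

-- Store the reversed string once, then index it.
ALTER TABLE dbo.TestData
     ADD SomeDateTextReversed AS REVERSE(SomeDateText) PERSISTED;

CREATE NONCLUSTERED INDEX IX_TestData_SomeDateTextReversed
     ON dbo.TestData (SomeDateTextReversed);

-- The predicate is now left-anchored, so it can use an index seek.
SELECT
     SomeDateText
FROM
     dbo.TestData AS td
WHERE
     td.SomeDateTextReversed LIKE '90-%';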

18.    Full Text index vs Like?

I have not had the opportunity to use or even investigate Full Text indexes, so I can’t comment on this at this time. (sorry)

19.    Is it ok to manually create tables in the temp DB? Would that be the same as creating materialized tables, but will be wiped out after a DB server restart?

Temp tables and global temp tables are tables within TempDB. As far as creating materialized or permanent tables within TempDB, that is not a good idea. TempDB can already be burdened by all the other queries that use temp tables, table variables, and worktables. It would be better to create materialized tables outside of TempDB.

20.    For the Template Explorer… how do you share your templates with other users? When we create templates, they seem to be available only to the user who created them.

See Question #5 above, which covers integrating your Template Browser folder with source control so that templates can be shared.

21.    What’s the best hosting solution for MSSQL server standard/enterprise?

This is outside the scope of this session and an area I don’t have a lot of experience with.

22.    Does including the Actual Execution plan also put an overhead on the server?

No. The Actual Execution plan is what is created when a query is run. The query itself can cause overhead, but not the plan. In fact, the engine allows a specific amount of time to find the best plan. If it can’t go through all the different possibilities, then it will do the best it can with the possibilities it evaluated. You can see this in the execution plan. Click on the SELECT icon and look at the properties. For this query, it told me it found a “good enough plan”.

 

QuestionsAnswered20141010_Image2

 

If you want to get the last execution plan that was generated for a query, then you can get it using sys.dm_exec_query_plan. See how to use this DMV here.
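
As a sketch, you can join that DMV to the cached query statistics to see the text and plan of recently executed queries (this requires the VIEW SERVER STATE permission):

SELECT
     st.text AS QueryText
     ,qp.query_plan AS QueryPlan
FROM
     sys.dm_exec_query_stats AS qs
     CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
     CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp;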

Last but not least…

23.    What’s your favorite whisky?

Well, in the immortal words of Grant Fritchey… that depends. It depends on where I am, what I’m eating, and how many people I’m with. (I know, that is a weird one. But I find I can handle the smokier Scotches when I’m around a group of friends, but not when I’m home relaxing.) Also, I drink all my whiskey neat.

So, since I don’t have one favorite, I’ll answer the question by country in the order of my preference.

  • Ireland – I don’t think I’ve met an Irish whisky I didn’t like, but my favorite (right now) is Redbreast.
  • USA – I love 100% rye whiskey! Michter’s Rye and High West Rendezvous Rye are at the top of my list.
  • Scotland – I like the fruity and spicy Scotches from Speyside, with Oban at the top of my list.
But Wait, There’s More

Mark Finch mentioned a free formatting tool. I wanted to include it in my summary since I had never heard of it. You can find it at poorsql.com.

Thanks for all the fish

I want to thank everyone for attending my presentation, as well as for all the encouraging feedback I received.

Going Beyond The INSERT Statement

The Seventh Mission

In this installment of SQLCoOp, we are writing about CRUD. No, this is not about the stuff you scrape off the bottom of your shoes after hiking. This is Create, Read, Update, and Delete. I’ll be focusing on create, which is done with the INSERT Statement, but I’m going to go beyond the basic INSERT statement.

MCJJ_ALL

Multiple Rows

Let’s start with a feature of the basic INSERT…VALUES statement that we all know and love. In SQL Server 2008, Microsoft improved the INSERT…VALUES statement by allowing you to include multiple lists of values in the same query. This allows you to insert multiple rows of data with one INSERT statement. I find this syntax to be a wonderful feature when I’m creating small sets of data using Excel.

Here is an example of what the syntax looks like. The values for each row are enclosed in parentheses and separated by commas. In this example, I’m entering student scores for tests the students took at the end of September. After this one query runs, there will be 12 rows in the table.


CREATE TABLE StudentScore
(
     StudentScoresID int IDENTITY PRIMARY KEY
     ,StudentID int
     ,TestingRoomID int
     ,Score tinyint
     ,TestDateTime datetime
);

 

INSERT INTO StudentScore
     (StudentID,TestingRoomID, Score, TestDateTime)
VALUES
     (35, 1, 110, '2014-09-30 17:00:00.00')
     ,(36, 1, 87, '2014-09-30 17:00:00.00')
     ,(42, 1, 94, '2014-09-30 17:00:00.00')
     ,(10, 1, 99, '2014-09-29 12:00:00.00')
     ,(35, 10, 90, '2014-09-29 17:00:00.00')
     ,(36, 10, 100, '2014-09-29 17:00:00.00')
     ,(42, 10, 105, '2014-09-29 17:00:00.00')
     ,(10, 10, 99, '2014-09-29 17:00:00.00')
     ,(35, 5, 115, '2014-09-28 17:00:00.00')
     ,(36, 5, 70, '2014-09-28 17:00:00.00')
     ,(42, 5, 90, '2014-09-28 17:00:00.00')
     ,(10, 5, 67, '2014-09-28 17:00:00.00');
Common Table Expressions (CTEs)

I love CTEs, sometimes a little too much. The reason I think they are so great is twofold. First, they can make complicated queries easier to read and maintain. Second, they can perform complicated functionality such as recursion without breaking a query up into multiple statements. An added benefit is that they can be used with all CRUD statements, including the INSERT statement.
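
As a tiny sketch of that recursion point, here is a CTE that generates the numbers 1 through 10 in a single statement:

-- A recursive CTE: the anchor row seeds the set, and the recursive
-- member keeps adding rows until the WHERE condition stops it.
WITH cte_Numbers
AS
(
     SELECT 1 AS Number
     UNION ALL
     SELECT Number + 1
     FROM cte_Numbers
     WHERE Number < 10
)
SELECT Number
FROM cte_Numbers;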

Let’s take a look at an example.

I have a group of students who were given the same test multiple times in different classrooms on different dates. I want to insert only the row representing the highest score each student had. I also want to make sure that I keep the first time they received this score and any other data that came with that record.

Below I’ve highlighted the rows that I want to insert into the new table.

SQLCoOp7_Image1

The first step is building the SELECT statement for the CTE that numbers the scores for each student. This is done by using the ROW_NUMBER function in conjunction with the OVER clause. The OVER clause will have two parts. The first part will use the PARTITION BY clause. This is used to restart the numbering for each partition. In this case we will partition the data by StudentID. The second part will use the ORDER BY clause. This tells us how to order the data so that it will consistently come out in the same row number. Since we want the highest score first, we will order the data by Score in descending order. Then we will order the duplicates by TestDateTime so that the first date the score was achieved will appear first in the order.

SELECT
     ss.StudentScoresID
     ,ss.StudentID
     ,ss.TestingRoomID
     ,ss.Score
     ,ss.TestDateTime
     ,ROW_NUMBER() OVER (PARTITION BY ss.StudentID
               ORDER BY ss.Score DESC, ss.TestDateTime) AS RowID
FROM
     dbo.StudentScore AS ss

This SQL statement above will be used in our CTE. We’ll then use all the rows that have a RowID of 1 for the INSERT. The final INSERT statement will look like this.

CREATE TABLE dbo.StudentScoreHigh
(
     StudentScoresID int PRIMARY KEY
     ,StudentID int
     ,TestingRoomID int
     ,Score tinyint
     ,TestDateTime datetime
);

 

WITH cte_Scores
AS
(
     SELECT
          ss.StudentScoresID
          ,ss.StudentID
          ,ss.TestingRoomID
          ,ss.Score
          ,ss.TestDateTime
          ,ROW_NUMBER() OVER (PARTITION BY ss.StudentID
               ORDER BY ss.Score DESC, ss.TestDateTime) AS RowID
     FROM
          dbo.StudentScore AS ss
)
INSERT INTO dbo.StudentScoreHigh
(
     StudentScoresID
     ,StudentID
     ,TestingRoomID
     ,Score
     ,TestDateTime
)
SELECT
     StudentScoresID
     ,StudentID
     ,TestingRoomID
     ,Score
     ,TestDateTime
FROM
     cte_Scores
WHERE
     RowID = 1;
MERGE

A different way to approach this same problem would be to use the MERGE statement. The MERGE statement is like CRUD on steroids. It will look at all the data you specify that is coming in and compare it to all the existing data. Then you can determine what will happen: do you want to insert, update, delete, or ignore the data based on what does and does not match?

For this example, we’ll assume we are constantly updating the StudentScoreHigh table with each student’s highest score. We’ll compare what is already in the StudentScoreHigh table (the target) with what is in the StudentScore table (the source) each night. If there is a record already in the table and the student achieved a higher score today, then the existing record will be updated. If the student does not exist in the table, then the student will be inserted into the table. Note: for this example, we’ll assume that the student can only take the test once in a day.

The MERGE statement will look like this.


MERGE dbo.StudentScoreHigh AS target
USING
(
     SELECT
          StudentScoresID
          ,StudentID
          ,TestingRoomID
          ,Score
          ,TestDateTime
     FROM
          dbo.StudentScore AS ss
     WHERE
          TestDateTime >= DATEADD(d, -1, GETDATE())
) AS Source ON Target.StudentID = Source.StudentID
WHEN MATCHED AND Source.Score > Target.Score THEN
UPDATE SET
     StudentScoresID = Source.StudentScoresID
     ,StudentID = Source.StudentID
     ,TestingRoomID = Source.TestingRoomID
     ,Score = Source.Score
     ,TestDateTime = Source.TestDateTime
WHEN NOT MATCHED THEN
INSERT
(
     StudentScoresID
     ,StudentID
     ,TestingRoomID
     ,Score
     ,TestDateTime
)
VALUES
(
     Source.StudentScoresID
     ,Source.StudentID
     ,Source.TestingRoomID
     ,Source.Score
     ,Source.TestDateTime
);
BONUS

If you have the latest version of Red Gate’s SQL Prompt installed (ver 6.4), then you will have an even easier time writing your favorite INSERT statement. They added a new feature that will highlight the field of the value you are modifying and vice versa. Below you can see that my cursor is on line 61 and line 54 is highlighted. If I were to put my cursor on line 54, then line 61 would be highlighted.

 

SQLCoOp7_Image2

Don’t Stop Yet

Don’t forget to check out these blog posts by the rest of the SQL CoOp team on the subject of CRUD:

To follow our quest for SQL knowledge through this collaborative project, follow the #SQLCoOp tag on Twitter.

See you next time!!

On a SQL Collaboration Quest

Four SQL professionals gathered from the four corners of the world to share their SQL knowledge with each other and with their readers: Mickey Stuewe from California, USA, Chris Yates from Kentucky, USA, Julie Koesmarno from Canberra, Australia, and Jeffrey Verheul from Rotterdam, The Netherlands. They invite you to join them on their quest as they ask each other questions and seek out the answers in this collaborative blog series. Along the way, they will also include other SQL professionals to join in the collaboration.

Original Post: On a SQL Collaboration Quest

SQL Server Data Transferred to a SQLite Database Using SSIS

The Sixth Mission

In this installment of SQLCoOp, we are writing about SSIS. For me, this happens to tie closely with the June SQLCoOp post I wrote, called Data Models, SQL Server, SQLite, and PowerShell. In that post I walked through how to create a data model in ER/Studio Data Architect and have it end up in a SQLite database. In this post I’ll show how to use SSIS to get master data from a SQL Server database and insert it into a newly created SQLite database that can then be deployed with client applications.

MCJJ_ALL

The Project

I needed to create a consistent way of pre-populating an empty SQLite database to be used in new builds of our application. So I converted my development SSIS package, which moved test data from SQL Server into the SQLite database, into a production SSIS package that pre-populates master data into a SQLite database.

Before you get started trying this example out, make sure that you have an ODBC driver for the flavor of SQLite that you are using installed on your computer. Here is a great resource for ODBC drivers. After the installation is finished, set up an ODBC connection to the SQLite database that you’ll be populating.

Step One

The first step was to run a DELETE statement for each of the tables that I’ll be populating. This allowed me to not worry about the destination database still being populated from the last time I ran the package. I used an Execute SQL task for each table. I set the Connection property to the ODBC connection I created, and I set the SQLStatement property to the corresponding DELETE statement.
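
The statement itself is as simple as it sounds. As a sketch, each Execute SQL task ran a table-level delete like this (the table name is an assumption):

-- SQLite DELETE statement used in one of the Execute SQL tasks.
DELETE FROM AddressType;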

SQLCoop6_Image1

 

SQLCoOp6_Image2

Step Two

For the next step, I created a Data Flow task for each table. Inside each Data Flow task, I created an ADO.Net Source and an ADO.Net Destination task.

For each of the tables I wanted to populate, I created a unique stored procedure in a utility database called SSISMigration. This allowed me to customize the needs of each of the stored procedures without worrying about the needs of the applications using similar stored procedures in the source database.

I set the ADO.Net Source connection to the SSISMigration database. I set the Data Access Mode to “SQL Command” and the SQL Command text to an execute statement such as “EXEC MSSL.GetAddressType”.

Side Note: If you want to pass a parameter to the stored procedure in a generic form, then check out the Expressions property of the parent Data Flow task.

 

SQLCoOp6_Image3

For the ADO.Net Destination task, set the Connection Manager property and select the table that you would like to have populated. Then click on the Mappings menu item to verify that the columns are mapped correctly between the Source and Destination tables.

SQLCoOp6_Image4

Step Three

Now I can connect each of my Execute SQL Tasks with their corresponding Data Flow tasks. This will ensure that the delete process occurs before the insert process.

SQLCoOp6_Image5

Step Four

The first time I built this process, I had all the statements connected in one big line. This made the whole package run serially, one task at a time. I then learned that I could have multiple processes happening at the same time. But what if I have one last process that needs to occur after all other processes are finished? This is where the awesome Sequence Container comes into play. By placing all my Delete/Insert processes into the Sequence Container, I can have them run in parallel and still have a final process occur after they have all completed.

Step Five

Finally, I need to clean up after myself. Since I did some deleting and inserting, I want to make sure that my brand new SQLite database is as small as possible. SQLite has a command called VACUUM. This command rebuilds the database to remove any fragmentation caused by CUD operations. To run the VACUUM command, I use another Execute SQL Task. I set it up like I did in step one, with the SQL Statement set to “VACUUM;”.

Now I have three processes that occur in parallel and one process that happens after the first three are completed.

SQLCoOp6_Image6

Step Six

Run the package. If I’m only making this “new” master copy of the database for development, then I might just run the package from Visual Studio on an as-needed basis. If I want this to be an automated process, then I can set the package up on SQL Server to run on a schedule.

But Wait, There’s More

My example here was for populating a new SQLite database with master data from SQL Server. I could use this same process for creating a subset of data for testing needs or for creating new SQL Server databases that need to be deployed in new environments.

Don’t Stop Yet

Don’t forget to check out these blog posts by the rest of the SQL CoOp team on the subject of SSIS:

To follow our quest for SQL knowledge through this collaborative project, follow the #SQLCoOp tag on Twitter.

See you next time!!

On a SQL Collaboration Quest

Four SQL professionals gathered from the four corners of the world to share their SQL knowledge with each other and with their readers: Mickey Stuewe from California, USA, Chris Yates from Kentucky, USA, Julie Koesmarno from Canberra, Australia, and Jeffrey Verheul from Rotterdam, The Netherlands. They invite you to join them on their quest as they ask each other questions and seek out the answers in this collaborative blog series. Along the way, they will also include other SQL professionals to join in the collaboration.

Original Post: On a SQL Collaboration Quest

T-SQL Tuesday #57 – SQL Family to the Rescue of a Local Community

My good friend Jeffrey Verheul (b|t) is hosting this month’s T-SQL Tuesday blog party. The party was started by Adam Machanic (b|t) in December of 2009. This month’s invitation is on the topic of SQL Family and Community.

I was very happy to see this topic this month since I just had an amazing SQL Family kind of day a week ago and this is the perfect venue to share it.

The Community

At the beginning of this year I took a leap of faith and started a local PASS chapter in Irvine, CA, focusing on BI. It’s called the BIG PASS Community. I’ve been growing the group slowly and steadily each month. I have a handful of people who come every month, and we usually have one or two new people as well. A couple of months ago, I was approached by a SQL Family member, Rob Hatton, who was a recent transplant from Florida. He wanted to help with my new community, so now we are co-leaders.

Before each meeting, we eat dinner together in the kitchen at our venue so that we can get to know each other better and network. Not only is our venue really nice, but Rosalyn, who is a Sales Rep there and stays late for us, is a wonderful hostess. She helps promote our meetings and helps take care of little details. We’re really blessed to have her.

The phone rang on a Friday afternoon

Rosalyn often calls me before our meetings to find out who our latest speaker is and if they will be presenting locally or not, so I wasn’t surprised to receive a phone call from her. This time she had some bad news for us. She had decided to leave the company to pursue a new opportunity, but she couldn’t find another person to host us in the evenings at their facility. We were now without a venue.

I was really worried about losing our venue, since I’ve seen other user groups unable to meet for months until a new location was found. Our group doesn’t have any financial resources to pay for a venue either. What was a girl to do?

Twenty-four hours

I spoke with my co-leader about the situation and we developed a plan. We would change the next meeting to a networking event at a restaurant. That would allow us to still have a meeting and give us a month to find a new venue. I sent out emails late Sunday night to our community members letting them know what had happened and the new schedule for our next meeting.

Monday morning I received an email from David, who is one of our community members: “We have a classroom at work. Do you want me to see if we can use it?” I replied, “YES!” He kept updating me throughout the day with his progress on getting approval. Then I received an email from another community member named Ted: “We can use our classroom at work. We also have a nice break room for our dinners together.” We now had a venue… and a potential backup venue.

This is a perfect example of the heart of SQL Family. They step in when someone needs help and lend a hand.

Building relationships

I want to share with you how I met each of the people I mentioned above.

In the spring of 2013 I was at the after party at the Orange County SQL Saturday. One of my friends wanted to introduce me to someone who recently moved near me. His name was Rob Hatton. I had the wonderful opportunity to get to know him and his lovely wife Barb better that night. We then crossed paths at two other SQL Saturdays over the last year.

Last year, at PASS Summit, I helped host the Southern California user group tables at lunch. I wanted to connect with more people in my area. We had several new people join us for lunch who weren’t aware of the local user groups. Two of those people were David and his co-worker James. We had a great time getting to know each other at lunch, and I was delighted to see them at many of the evening events, where I had the opportunity to speak with them further. When I started the BIG PASS Community user group, they started attending it as well.

This past April was the local SQL Saturday event in Orange County. That was where I met Ted. He attended both of my morning presentations. We crossed paths again at lunch, where we had time to network further. I was able to tell him about the local user groups and encourage him to attend them. I was really happy to see him at the June meeting.

Why am I sharing this with you? Well… to show you how integrated our community is. To show you how far the roots travel. SQL Family connections don’t all start in a classroom or on Twitter. They start at lunch tables, karaoke bars, and walking between presentations.

Thanks for all the fish

Thanks go out to Jeffrey Verheul for hosting this month’s T-SQL Tuesday blog party. He is one of my favorite SQL Family members whom I met through my T-SQL Tuesday participation. While seven time zones and a large ocean separate us, technology has allowed us to be friends, co-bloggers, and SQL Family members.

Becoming a SQL Prompt Power User

Four SQL professionals gathered from the four corners of the world to share their SQL knowledge with each other and with their readers: Mickey Stuewe from California, USA, Chris Yates from Kentucky, USA, Julie Koesmarno from Canberra, Australia, and Jeffrey Verheul from Rotterdam, The Netherlands. They invite you to join them on their quest as they ask each other questions and seek out the answers in this collaborative blog series. This month we are cooperatively blogging about SQL Prompt by Red Gate.

The Next Mission

What makes up a SQL Prompt power user? Well, they use the SQL Prompt commands like they were extensions of SSMS. They know the shortcuts, and they’ve memorized the key words to call up their favorite scripts. Let’s take a look at what these shortcuts are and how to integrate them seamlessly into your everyday SQL life.

Playing The Ebony Keys

A power user of any software is one who uses the hot keys in lieu of the menus. SQL Prompt has a plethora of hot key combinations.

Before I use <ctrl> + e to execute my code, I always use <ctrl> + k, <ctrl> + y. This hot key combo will structure your code based on the styles you have set in SQL Prompt. This is my most heavily used hot key combo, right after <ctrl> + e and <ctrl> + r (hiding the results window).

There is a new hot key combination coming out soon in version 6.4, and I’m very excited about it. It adds a semicolon at the end of each statement! I’ve been trying to build the habit of using semicolons for two years now, and it never sticks. Now I can use <ctrl> + b, <ctrl> + c and away I go.

Here is a list of the hot key combinations I use the most, plus the shortcut way I remember them.

  • <ctrl> + b, <ctrl> + w (shortcut: <ctrl> + bw) – Expands ALL of the wild cards.
  • <ctrl> + b, <ctrl> + q (shortcut: <ctrl> + bq) – Qualifies all the SQL objects and fixes the capitalization as well.
  • <ctrl> + b, <ctrl> + c (shortcut: <ctrl> + bc) – Adds semicolons to the end of each statement. (In beta; available soon in 6.4.)
  • <ctrl> + k, <ctrl> + y (shortcut: <ctrl> + ky) – Standardizes the code based on the formatting styles that were set in SQL Prompt.

As a power user, my key strokes will look like this when I use them all at once:

<ctrl> + bw, bq, bc, ky and finally <ctrl> + e to execute my code. Here is an example of how an ugly duckling turns into a beautiful swan by applying all of these hot keys at one time.

Before


alter procedure [dbo].[salesteamassignment_rptugly](@group as varchar(1000), @startdate as date, @enddate as date)as
set nocount on
declare @uselessvar as varchar(max)
select
p.businessentityid ,firstname,lastname,s.name as store,[group] as TerritoryGroup,cr.name as CountryRegion,st.name as Territory,customercount,s.*
from
adventureworks2008r2.sales.salesperson as sp join adventureworks2008r2.person.person as p on sp.businessentityid = p.businessentityid
join adventureworks2008r2.sales.salesterritory as st on sp.territoryid = st.territoryid join [adventureworks2008r2].[person].[countryregion] as cr on st.countryregioncode = cr.countryregioncode
join adventureworks2008r2.sales.store as s on s.salespersonid = sp.businessentityid join dbo.fn_split(',', @group) as grps on st.[group] = convert(nvarchar(50), grps.separatedvalue)
outer apply (select count(*) as customercount from adventureworks2008r2.sales.customer as c where s.businessentityid = c.storeid ) as cc

After

ALTER PROCEDURE dbo.SalesTeamAssignment_rptUGLY
(
     @group AS varchar(1000)
     ,@startdate AS date
     ,@enddate AS date
)
AS
SET NOCOUNT ON;
DECLARE @uselessvar AS varchar(MAX);
SELECT
     p.BusinessEntityID
     ,p.FirstName
     ,p.LastName
     ,s.Name AS store
     ,st.[Group] AS TerritoryGroup
     ,cr.Name AS CountryRegion
     ,st.Name AS Territory
     ,cc.customercount
     ,s.BusinessEntityID
     ,s.Name
     ,s.SalesPersonID
     ,s.Demographics
     ,s.rowguid
     ,s.ModifiedDate
FROM
     AdventureWorks2008R2.Sales.SalesPerson AS sp
     JOIN AdventureWorks2008R2.Person.Person AS p ON sp.BusinessEntityID = p.BusinessEntityID
     JOIN AdventureWorks2008R2.Sales.SalesTerritory AS st ON sp.TerritoryID = st.TerritoryID
     JOIN AdventureWorks2008R2.Person.CountryRegion AS cr ON st.CountryRegionCode = cr.CountryRegionCode
     JOIN AdventureWorks2008R2.Sales.Store AS s ON s.SalesPersonID = sp.BusinessEntityID
     JOIN dbo.fn_Split(',', @group) AS grps ON st.[Group] = CONVERT(nvarchar(50), grps.SeparatedValue)
     OUTER APPLY (
          SELECT
               COUNT(*) AS customercount
          FROM
               AdventureWorks2008R2.Sales.Customer AS c
          WHERE
               s.BusinessEntityID = c.StoreID
     ) AS cc;

But Wait, There’s More

Formatting your code is only half of the features in SQL Prompt. There are three other features that every power user should use. (OK, truth be told, there are more than three, but my article shouldn’t be the length of a book.)

Feature 1:

SQL Prompt offers the ability to save snippets of code. This is really useful for repetitive tasks. Here is a list of articles that I’ve written talking about specific snippets that I find handy.

Placeholders are one of the most helpful features that SQL Prompt provides for you to work with snippets. The placeholders are reserved words that are variables for specific pieces of information or represent actions that can be applied to your snippet. They are always completely capitalized and are surrounded by dollar signs ($). You can find the current list for SQL Prompt 6 here.

One of the new placeholders is perfect for the power user. It allows you to insert a snippet and have part of it highlighted. Here is an example of how I like to use it. This particular snippet, which I named “st1”, will add SELECT TOP 100 * FROM in front of the table name that I’ve highlighted and copied to the copy buffer.

$SELECTIONSTART$SELECT TOP 100 *
FROM $PASTE$$SELECTIONEND$

The $PASTE$ placeholder will be replaced with whatever is in my copy buffer. The $SELECTIONSTART$ and $SELECTIONEND$ placeholders will cause the code within the placeholders to be highlighted.

To use it, I follow these steps.

1. I highlight a table I want to use and hit <ctrl> + c to get the table name into my copy buffer.

SQLCoop5_image1

 

2. I type st1 + <tab> to get the snippet. I now have a highlighted SELECT statement ready to execute, so I hit <ctrl> + e to execute it.

SQLCoop5_Image2

 

3. Finally, I type <ctrl> + z twice to undo the changes I’ve made.

SQLCoop5_image1

 

This allows me to quickly see some sample data from the table without ever leaving my current query window.

Feature 2:

This is a hidden gem. This feature finds all the invalid objects in your database. This is a great feature for those of you who need to perform code reviews and want to make sure that any DDL changes have not broken any existing SQL Objects.

In this example I have a table with a trigger that uses the field ModifiedDTS. I deleted the field and ran the command. It listed all the SQL Objects that are now broken due to the deleted field.

SQLCoop5_Image3

This feature does not have a Hot Key, but it is definitely worthy of a power user.

Feature 3:

This is a great “clean up” feature, especially when you are working with stored procedures with 200+ lines that you have inherited. You know the ones; they are filled with old code and have never been cleaned up. Well, this feature is perfect for them, because it finds all the unused variables and parameters. Simply use the hot key combination <ctrl> + b, <ctrl> + f. SQL Prompt will then underline in green all the unused variables and parameters for you to evaluate for usefulness.

SQLCoop5_Image4

 

I encourage you to spend 10 minutes going through the SQL Prompt menu to find out what other goodies exist for you to leverage. There are also new features coming soon in version 6.4. Here is a great reference to all the release notes for each version, plus the release notes for the current beta version.

Don’t Stop Yet

If you want to read more about how to work with SQL Prompt, then check out these blog posts:

To follow our quest for SQL knowledge through this collaborative project, follow the #SQLCoOp tag on Twitter.

See you next month!!

Original Post: On a SQL Collaboration Quest

T-SQL Tuesday #56 – Starting Off With Assumptions

Dev Nambi (b|t) is hosting this month’s T-SQL Tuesday blog party. The party was started by Adam Machanic (b|t) in December of 2009. This month’s invitation is on the topic of Assumptions. Dev gave some great examples for possible T-SQL Tuesday posts with Assumptions as the focus. Here is the invitation:

For this T-SQL Tuesday, the topic is assumptions. For example:

  • The sun will come up tomorrow.
  • Firewalls and anti-virus are enough to protect my computer.
  • My backups work even if I don’t restore them.
  • I don’t need to check for that error, it’ll never happen.

Your assignment for this month is to write about a big assumption you encounter at work, one that people are uncomfortable talking about. Every team has an elephant in the room.

This is a great topic, but I want to take this in another direction. I want to talk about the assumptions that should be stated at the beginning of a project.

In the Beginning

When we start a new project, we are given requirements. You might be laughing at me and saying, “Mickey, I’m never given proper requirements. I just get an email with a couple of sentences or I’m told verbally what to go do.” Well, while I agree that an email or a verbal assignment is a crappy way to be given requirements, if that is what you have, then that is what you have.

The good news is, you don’t have to leave it at that. No matter how bad of a writer you think you are, if you can expand on what little information you are given, then you will be better off in the end. And this is where assumptions come in.

Assumption
Dictionary.com
1. something taken for granted; a supposition: a correct assumption. Synonyms: presupposition; hypothesis, conjecture, guess, postulate, theory.
4. the act of taking possession of something: the assumption of power. Synonyms: seizure, appropriation, usurpation, arrogation.
Covering your behind with assumptions

Whenever I’m given an assignment, whether it is in a full-blown requirements document (angels get their wings when this happens) or as some vague guidelines, I always create additional documentation. Sometimes this documentation is for me. Sometimes it is for my teammates. Sometimes it is used only as a discussion point in a meeting. No matter who the documentation is for, I always include a section called Assumptions. This section is at the beginning of the document, and it lists all the elephants in the room that no one wants to look at. Sometimes these elephants are actually really drunk. (Those also get listed under a different section called Risks.)

You should list all the various pieces of information that you think are important, but are missing from what you were given. This information could be facts that you need, such as all dates will be based on UTC time. Or it could be tasks such as a new web service needs to be built by another team.

Some of your assumptions might be wrong. That is not a bad thing to have happen. If you hadn’t listed XYZ as an assumption and you coded to XYZ, then more time will need to be taken to fix your work, or worse, XYZ ends up in production. By having your incorrect assumption listed and corrected, you now know more information about your project.

Some of your assumptions will take the project owner by surprise. This is also not a bad thing to have happen. It means that you have information they were not aware of. It could also mean that they didn’t think the project all the way through. Either way they now have information they didn’t have before.

By listing out all the missing pieces of information and tasks that need to be accomplished to make your project successful, you are shedding light on the entire scope of the project. You are taking control by assuming ownership of your situation and helping the end result of the project to be a successful one.

NOTE: There is one type of document that I don’t include an Assumptions section in, and that is my SSRS report requirements. Those requirements are gathered using a template so that none of the answers I need are missed. For me, this is the only exception.
Thanks for all the fish

Thanks go out to Dev Nambi for hosting this month’s T-SQL Tuesday blog party. Please visit his website at http://devnambi.com.

I’m Speaking at PASS Summit This November

I’m very excited to share that my abstract was one of the 144 abstracts selected for PASS Summit 2014. This will be my first time speaking at PASS Summit, and I just can’t wipe the grin off my face.

My presentation is called Techniques for Dynamic SSRS Reports and can be found in the BI track. In my presentation we’ll go over ways to add navigation to your reports, as well as how to make a single report satisfy different users’ needs.

I hope to see you all at Summit in Seattle this year!

Foreign Key Constraints: Friend or Frenemy?

This month has been wild and crazy for me. I’m still in Louisville, Kentucky after a very successful SQL Saturday yesterday (#286). While I’ve been in Louisville, SQL Server Pro magazine published my first article with them. I talk about all the problems that can be caused if you don’t have Foreign Key Constraints in your database. So grab some coffee and your favorite e-reader and enjoy my article: Foreign Key Constraints: Friend or Frenemy?
