Archive for Red Gate

SQL Source Control and Git Support In the Real World

When I heard that Redgate had improved the Git support in SQL Source Control, I have to confess I got excited. I even tweeted about it. There's a good reason for that.

I fell in love with SQL Source Control the first time I used it, at a previous company. Before it came along, we scripted out database objects, which is a pain. At one company we used the Visual Studio version, and that's painful as well because you don't get the immediate gratification of knowing what's changed: you have to actually bring in the changes before you can see what your changes are. I just found that very cumbersome.

SQL Source Control makes it a really easy process and identifies all of your changes. It's especially useful if you're working on multiple projects and one project sits untouched for three weeks before you come back and look at the changes. You might think, for example, oh yeah, I was working on indexing and I should probably get those indexes into production. Using SQL Source Control, you can do that quickly and easily.

The missing link

As much as I love SQL Source Control, a niggle I’ve had has been its limited support for Git. Like many other people, I moved from Subversion to Git because its branching features are so much better, and adopted Atlassian Stash (now called Bitbucket Server) as the repository.

That's where I ran into difficulty, because there was a gap between SQL Source Control and the Stash repository on the server. I tried a workaround using the command prompt, but it was so painful and cumbersome I turned to Atlassian SourceTree instead. A free Git client, it provides a GUI for doing the push and the pull between the server, Git, and the local repository.

The tweet

At the beginning of October 2015, everything changed. That was when Redgate announced SQL Source Control users could now push to and pull from remote Git repositories.

And not just from within SQL Source Control – from within SQL Server Management Studio (SSMS).

I was thrilled because this one apparently small move eliminated an entire step in another application. No more closing this, opening that, moving this, quitting that, going back to the application you were happily working with before.

It also made things a lot clearer to team members who were new to using version control by taking the complexity out of the process.

You’re not out of the woods yet, Redgate

I use SQL Source Control every single day and I’ve now got my entire team checking in their changes using Git.

They can commit changes, get the latest changes, and do everything they need to do without complications. As well as saving time, it makes it easier for less technical staff who are new to using repositories. A glance at what it looks like in SSMS demonstrates how simple it is:

image

So it has made my life easier, but do you know something? There are still some improvements I’d like to see.

What branch am I on?

Top of the list would be to see which branch I'm on. Right now I can see which database I'm on, but not which branch.

How many changes are ready to push or pull?

Near the 'Push' and 'Pull' buttons, it would be nice to know if there are files to pull or push, in case other users have checked in changes. A number on the buttons would work as well, showing how many items have been checked in locally and are ready to go up or come down.

The pull button is the more important one, to make sure you're on the right version. The team keeps forgetting to pull down changes that exist on the repository, so this would be a real bonus.

What about conflict resolution?

Conflict resolution would be another great feature, because sometimes it's difficult. The developers don't always remember to bring down the latest version, and when they check something in, it causes a conflict. I think I spent an hour fixing that yesterday, and I still have to go back and deal with it on one of my teammates' machines.

Right now, I use KDiff3 to merge our QA branch and personal branches and resolve conflicts, but a feature within SQL Source Control would make things a lot simpler.

And finally

Finally, I'm looking forward to when migration scripts are incorporated into SQL Source Control. At that point, everything will become a breeze.

If, like me, you want to see some of these features added then visit the SQL Source Control UserVoice site and vote!

Adventures Into Azure Databases For The First Time (Part 2 of 2)

In my last post, I started a two-part series to learn how to create and migrate a database from SQL Server 2014 to a new Azure database.

Recap of My Goals

In Part 1 of this series, I completed the following goals:

  • Activated my free credits through my MSDN license.
  • Created my first empty database.
  • Determined how to connect to the database.

In this post (part 2), my goals are:

  • Poke around the Azure GUI to learn what is available with my free credits.
  • Use my Red Gate tools to create a new schema and load it with data.
  • Learn if I can take a backup from SQL Server 2014 and restore it in my Azure account.

4. Poking Around the Portal

When I was poking around, I found some great features. I discovered these screens by clicking on the database name on the Portal screen (Figure A), which reveals a series of links at the top.

11-29-2015 5-37-20 PM_thumb[1]

 

There is a monitoring screen, which also has some helpful metrics at the bottom, like deadlocks. This is an important screen. You don't want to max out your resources or you'll be considered a "noisy neighbor" and Microsoft will do something about it.

11-28-2015 5-00-57 PM_thumb[1]
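As an aside, you can also keep an eye on these numbers from a query window. This is just a quick sketch I tried while poking around (not something the portal requires); as I understand it, sys.dm_db_resource_stats keeps recent resource usage for the database, so you can see how close you are to becoming that noisy neighbor:

-- Run inside the Azure database itself.
-- Recent resource usage, expressed as a percentage of the service tier's limits.
SELECT TOP (20)
     end_time
    ,avg_cpu_percent
    ,avg_data_io_percent
    ,avg_log_write_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;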

I also found the Scale Out page. This page allows you to increase or decrease the size of your database. This can be very helpful if your database slowly changes size throughout the month. By changing the service level or the DTUs to only what you need, you can save money.
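The portal is how I made the change, but for reference the same scaling can be scripted. Treat this as a sketch only; the database name and target tier below are just examples, not what I actually ran:

-- Scale the database up to Standard S0; scale back down the same way when it is no longer needed.
ALTER DATABASE [Utility]
MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S0');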

5. Upload A Database

I have an existing Database on my SQL Server 2014 server that I want to migrate to my Azure account. I want to first move the schema, then the data. I have always trusted my Red Gate tools to create reliable, deployable scripts. The two tools I’ll be using are SQL Compare and SQL Data Compare.

SQL Compare will compare my existing SQL Server 2014 database schema with my new Azure database schema, which is currently nonexistent. The reason why I like this tool is that it creates serializable transactions for deploying all the individual scripts as one script. If the script fails at any point, then the whole script is rolled back. The second tool, SQL Data Compare, will be used to move the data. I don’t recommend doing this on a gigabyte of data, but for my small database it will work great. It is also a good tool for moving all the values of lookup tables, or for resetting development data to a previous state.

When you open up SQL Compare, you are presented with a screen to set up the two databases. The left-hand side has the database with the SQL objects you want to deploy, and the right-hand side has the database you want to deploy to. After you enter the server and login credentials, SQL Compare connects to the server to obtain the list of databases.

At this point, I learnt another lesson. When I went to get the list of databases from my Azure server, I was given an error message. It stated that my IP address was not allowed to access the Azure server. Back to the Portal I went, and I followed these steps:

  1. On the Portal, I clicked on the server name.
  2. I clicked on Configure in the menu bar under the server name.
  3. I was presented with a very helpful page. It told me which IP address was currently accessing the portal. At this point, I clicked on the link “Add to the allowed IP addresses”
  4. Finally, I clicked save at the bottom.

11-29-2015 8-05-39 PM_thumb[1]

 

I went back to SQL Compare, and I was able to retrieve the list of databases from my Azure server. Win!
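As another aside, the same firewall rule can be created with T-SQL once you have any connection to the server. This is only a sketch (the rule name and IP address are made up), and it has to be run in the master database on the Azure server:

-- Run in the master database on the Azure server.
-- This allows a single IP address; widen the range by varying the start and end values.
EXEC sp_set_firewall_rule
     @name             = N'HomeOffice'
    ,@start_ip_address = '203.0.113.42'
    ,@end_ip_address   = '203.0.113.42';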

image

When I clicked the Compare Now button, SQL Compare compared the two databases and provided a list of deltas. The deltas are broken up into groups: objects that exist in both databases but differ, objects that only exist in the database being deployed from, and objects that only exist in the database receiving the deployment. You are able to select only the changes that you want to deploy.

image

In this database, there are only two tables and one function that are relevant for my Azure database, so I selected them and then clicked on the Deployment Wizard button at the top. The next screen asked me whether I wanted SQL Compare to perform the deployment or to create a deployment script. I chose to create a deployment script. I was then directed to the screen below.

image

It showed me the script that I could use to do the deployment. When I clicked the "Open Script in Editor" button, a marvelous thing happened. SSMS opened, a tab opened with the script I would be deploying, AND I was connected to my Azure database with the same credentials I used to create the script. It also opened the correct database so that I was not left in the master database.

image

After I executed the script, I had a database schema in the new Azure Utility database.

image

 

Now to get some data.

SQL Data Compare works just like SQL Compare, but it compares all the data between two identical (or mostly identical) tables. Note: If the tables don't have primary keys set, then you can select the field that should act as the primary key.

image

Here you can see that I have two tables that I can compare. The ExpandedDate table has over 44K records and my Tally table has 1 billion records to move. The following steps are just like SQL Compare, and I can choose whether SQL Data Compare deploys the script or creates a script to be manually deployed.

Since this was my first time migrating data from my local database to my Azure database, I learnt a few things. One: a deployment script with over 1 billion inserts does not work. (I'm not surprised.)

Two, it took the SQL Data Compare tool a long time to migrate the data. I’m sure my home internet connection (with teenagers using the bandwidth) didn’t help.

6. Restore a local database backup to my Azure Database

While I waited…and waited for my Utility data to be inserted into my Azure database, I did some poking around to see if it was even possible to restore a local SQL Server 2014 backup to an Azure database. Guess what, I found something (And there was much rejoicing).

On CodePlex, I found a SQL Database Migration Wizard that can be used to restore the database. They even had Migration Wizards for 2008 R2, 2012, and 2014: SQL Database Migration Wizard v3.15.6, v4.15.6 and v5.15.6.

In the post, they also listed under their Reference Materials a "Migration Cookbook" that you can download and use: Migration cookbook now available for the latest Azure SQL Database Update (V12).

Recap

When I first started this two-part series, I had only taken classes on Azure topics; I had never gone through the steps of using the Azure portal. Now I've gone on a journey of completing a list of goals, which started with the creation of an Azure account and ended with loading a schema and data into a new database.

I hope you enjoyed this journey and will take a journey of your own exploring what Azure has to offer.

Adventures Into Azure Databases For The First Time (Part 1 of 2)

New Construction Home High Ceiling Wood Stud Framing

Whether we want to accept it or not, cloud computing is here to stay. Microsoft has made a big push in that direction for a few years now. When I first started using Office 365, I wasn’t sure I would like it. Now I love it. I love being able to access Word, Excel, and Outlook from wherever I am.

Now I want to do the same with my databases. To do that, I need to take the plunge and learn how to work with Azure Databases. If I don’t, I’ll fall behind, which is something I don’t want to do.

So let’s get started with my first Azure Database.

Ready. Get Set. Go!

Whenever you start a new project, you should set goals, and my experiment in learning about Azure Databases is no exception.

In Part 1 of this series I have the following goals:

  • Activate my free credits through my MSDN license.
  • Create my first empty database.
  • Determine how to connect to the database.

In Part 2 of this series:

  • Poke around the Azure GUI to learn what is available with my free credits.
  • Use my Red Gate tools to create a new schema and load it with data.
  • Learn if I can take a backup from SQL Server 2014 and restore it in my Azure account.

1. Setup Azure Account

The first thing you need to do is activate your Azure Account through your MSDN License. This only took minutes. I entered my contact information, then Microsoft sent me a text with an activation number, and I was set.

There were lessons that I learned though.

  • The Azure account uses the same login as Office 365, so I ended up setting up my Azure account for work the first time. Oops.
  • There is a difference between the Free Azure Credits for the Professional MSDN license and the Enterprise MSDN license.
  • The Professional MSDN license provides $100 per month free credit, while the Enterprise MSDN license provides $150 per month free credit.
  • You can pay for more credit, but I didn’t need to.
  • You can find the details here for all MSDN and Visual Studio licenses. Note: These credits are for development and testing only, not for production use.
  • The free credits can’t be used to work with third party products like Oracle. (I’m good with that.)

If you do have multiple Azure Accounts, you can verify which one you are currently logged into in the top right hand corner of the portal. Now that I’m paranoid as to which account I’m in, I’ll be watching that corner for my private email or my work email.

Portal

2. Setup My First Empty Database

Setting up my first database was as easy as setting up my Azure Account.

At the bottom of the portal, there is a New link and a Delete link. These are for creating and deleting databases.

After clicking the New link, I went through a series of screens to create my database.

The first screen asked me for the name of my database and what size database I wanted to create. This is an important step, since it will affect my monthly charges. Remember, I only have $150 in free credits each month. You can go here to see the pricing for the various service tiers and performance levels. I chose to create the smallest database I could (2 GB and 5 DTUs). I also created this database on a new SQL Database server (I kind of have to, since it is the first database).
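I let the portal do the work here, but for reference the same database can be created with T-SQL once the server exists. This is just a sketch (the database name is an example, and the Basic edition is the 2 GB, 5 DTU option I picked above):

-- Run in the master database on the Azure server.
CREATE DATABASE [AzureMonkey]
(
     EDITION = 'Basic'
    ,SERVICE_OBJECTIVE = 'Basic'
    ,MAXSIZE = 2 GB
);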

11-28-2015 4-24-28 PM

The second screen I was shown was for the login credentials of my Azure server. I created a login name and a strong password, and chose where I wanted my server to be located. Microsoft has locations all over the world that house the Azure servers. While I can't pick a particular data center, I can pick a particular region.

11-28-2015 4-27-29 PM

Now I have an Azure Server and Database. By clicking the Database Link or the Server link, I can see my Databases and Servers.

11-29-2015 4-48-49 PM

Figure A

There is another portal that you can look at as well. You can find it by clicking on your login name and selecting Azure Preview Portal. (This is in preview at the time of writing this article.)

There are a couple of different ways you can look at your servers and databases. One is a tile layout and the other is a traditional list layout.

11-28-2015 4-47-48 PM

Figure B

11-29-2015 4-55-29 PM

Lessons learned

  • I don’t get to pick the name of my server, but I’m ok with that. It will provide better security.
  • If I’m moving a database to Azure, I should probably provide the same name as my current database. This means I get to create two new databases, AWMonkey and Utility, which will be the two databases I’ll be working with. I’ll also delete the AzureMonkey database since I don’t need it.

3. Find the Azure Server and ODBC strings I'll need to access the database from the applications on my laptop

This was very easy. I remember the '90s, when I would bang my head on the table trying to figure out how to put together an ODBC string the first few times. (This was the stone age, when Google didn't exist. I know, horrifying.)

  • If you are on the current portal page (Figure A above), then you double click on the database name. Towards the bottom you’ll see the string that will be used to connect to your Azure Server.
  • By clicking on the link for the connection strings, you’ll be given four different connection strings. You just need to provide your user account credentials.

11-29-2015 5-06-58 PM

If you are in the preview portal (Figure B above), you can click on the name of the database, which will bring you to a property page containing the string for the Azure server. From there you can click on the link that provides the connection strings.

11-29-2015 5-19-15 PM

Until We Meet Again

Even though I've only created an empty database so far, I feel like I've accomplished a lot. I remember trying to get SQL Server 7.0 set up. It took two of us half the day and a fifteen-step cheat sheet to get it installed. (Yup. Still no Google.)

Note: Setting up an Azure Account and an empty database took less than 10 minutes. In fact, it took less time than it took me to write this post. See you soon for the next post.

My Eleven Day PASS Summit 15 Experience

“STUEWE!” This is how I knew PASS Summit was about to begin. I was walking to the Friday night SQL Saturday Portland speaker dinner, prior to PASS Summit, when I heard my name being called, well, yelled. Since it was already dark, it took a minute to spot the black SUV with Mike Fal hanging out the window waving at me.

But wait, that is not when my Eleven Day Summit Experience started.

Day One – Thursday

It actually started on Thursday before Summit at LAX. I was there bright and early to pick up two of my Australian friends, Martin Cairney and Ben McNamara, who would be traveling with me to Portland and then on to Seattle. Despite the 14 hour flight, they wanted to go see the Endeavor space shuttle at the California Science Center in LA.

IMG_4924

Day Two – Friday

Friday was spent flying to Portland, where we would be attending SQL Saturday Portland. We stayed in an Airbnb house. This is a great way to save some money on lodging and to feel like you're at home. There is always a coffee machine with decent coffee and the best part…no rambunctious kids on the floor above you. WIN!

Friday night, we attended a wonderful speaker dinner (This is when Mike Fal yelled my name out a window of an SUV). Speaker dinners are always fun to attend. You get to catch up with friends and make new ones. Portland is always a bit special, because it’s full of speakers from around the world who made a little extra time to speak at a community event before PASS Summit.

After dinner, we found an amazing restaurant. It had the most scrumptious gluten free bread. I think it was made of clouds.

IMG_4938

Day Three – Saturday

Saturday was spent at one of my favorite SQL Saturdays. Ok, I’ll be honest. ALL SQL Saturdays are my favorite. It’s a great time to do a little extra networking, get to see SQL Family, and meet new people in the community. I also love speaking and that is what I did in Portland. I spoke on SSRS.

2015 PASS Summit12

I was able to attend several wonderful sessions. My favorite was given by Julie Koesmarno and Cindy Gross: Moving beyond Unconscious Bias. I really liked their approach to this topic. Throughout the presentation, they strongly emphasized that we are all good people.

They spoke about how everyone categorizes everything in our lives, but sometimes we have a bias in our categorization, an unconscious bias that we are unaware of. They told us about a Harvard study called the Implicit Association Test, which tests how we categorize various topics. Julie and Cindy then showed us a video of Alan Alda taking the Harvard test. I strongly recommend learning about unconscious bias.

No Portland trip is complete without going to my two favorite places in the evening. We put our names in for our two-hour wait at the Multnomah Whiskey Library, and then we headed over to Cassidy's. They have great food and AMAZING bacon. I ordered enough bacon for all sixteen of us. (And there was much rejoicing.) Afterwards we went to the Whiskey Library, where I was finally able to try some Scapa Scotch. I really liked it (Thanks for the recommendation, Grant!).

Day Four – Sunday

One of the cheapest ways to get from Portland to Seattle is the train. When we bought our tickets, they were around $25. Little did we know it would be an unexpectedly amazing day. My friend, Ted Stathakis, was really looking forward to this day too. He loves trains; neither of us realized how amazing it would be.

Last year, there were only five of us on the train. Martin Cairney and I spent the majority of the trip troubleshooting a problem with my VM. Not this time. This time it was completely non-technical. There were sixteen of us in our car, and two who (foolishly) purchased tickets in a nicer car. Why do I say foolishly? Well, we were having so much fun, that they spent the majority of the trip in our car…eating our Voodoo Donuts. Next time, I think they will buy the general tickets. Next time, I’m also going to try getting the WHOLE car too. So if you want to ride with us, keep an eye out for a post from me in the summer.

2015 PASS Summit4

For more details on the fun we had, go look up the Twitter hashtag #SQLTrain.

Day Five – Monday

Monday was a red day. Red Gate day, that is. Red Gate puts on a free event called SQL in the City. They have put it on the Monday before PASS Summit for several years now. They bring quite a few people from the UK to talk about their tools, and they give amazing presentations. This year, they went with a common theme: Continuous Integration. I participated in the Lightning Talks. I spoke on how to use their DLM Dashboard. This is a great tool that keeps an eye on the databases you register with it. You can monitor who modified a database and which scripts they ran against it. Did I mention this is a free tool?

2015 PASS Summit5

Monday night is the annual Networking dinner put on by Steve Jones and Andy Warren. This is a “must attend” event. Why? It’s all about networking and integrating 1st timers into the community. This year, a bunch of us crammed into a booth with four 1st timers. At least two were from Europe. I enjoyed getting to know them and they enjoyed being introduced to all the people that came by our table to say hi.

Note: Remember, networking isn’t just about talking tech. It’s about building relationships so that you can ask technical questions in the future. If you need ideas on how to meet people, take a look at my blog post, Top Ten Ways To Create Your Meet and Greet List For Summit.

Day Six – Tuesday

Tuesday was a relaxing day before the busyness of Summit. I started the day off with breakfast with Chris Yates. This is a rare pleasure. Chris and I met through the SQL Community on Twitter. We now blog together occasionally and I speak at the SQL Saturday in his hometown.

Before I took time to prep for my Wednesday morning presentation, my apartment mates and I went to the Starbucks where they roast all of their coffee beans. I had my first French press, and, I must say, I'm hooked.

After I was done with my presentation preparation, it was time for the opening event of PASS Summit. This is a whirlwind event. I feel like a butterfly, fluttering from conversation to conversation. This year, I felt like a matchmaker too. I had three people from my company attending with me. I haven't had that happen since the mid-'90s! I enjoyed introducing them to a number of people.

I also did something different. I went looking for the lone 1st timers, the wallflowers who don't know who to talk to. I found one. He was eating by himself near a wall, watching everyone. I went up to him, introduced myself, and got to know him. I then asked if there was someone he wanted to meet. He wanted to meet Pinal Dave from India. Pinal and I know of each other, but we've never had the pleasure of shaking each other's hands. So that's what we did. Luckily, Pinal was standing nearby. Pinal and I finally shook hands, and I introduced him to the 1st timer. That made me very happy. Win!

2015 PASS Summit14

Day Seven – Wednesday

This was my day <deep breaths inserted here>. I was first up to bat <deep breaths inserted here>. I’m nervous as I walk to my room. I see my room and I get calmer. I climb up on the stage and I get calmer. I get my laptop set up and I’m ready to go. Boom.

As you’ve probably noticed, I really believe in networking. My class filled up with plenty of time to spare before the presentation, so I did the same thing I did last year. I had everyone stand up and introduce themselves to their neighbors. I stood on stage with a huge grin on my face, then I stepped off stage and introduced myself to people in the front row.

Two hundred people came to hear my presentation on Sophisticated Techniques in SSRS. I enjoyed every minute of the session. We did have a couple of exciting moments in class though. There was a laptop on the table (not mine) that started beeping. I thoroughly enjoyed slamming (er, nicely shutting) the laptop lid. There was also a crash of dishes behind me in the hallway. I said "Opa!" in my head and kept on presenting.

2015 PASS Summit15

With my presentation done, I was able to enjoy classes the rest of the conference.

One class I took was not found in a classroom, but on the edge of a planter on the ground floor. I started talking with this guy (sorry, I can't recall his name, but he was awesome). He was telling me how he had submitted his session, but wasn't selected. After he told me his topic, I told him I would have taken his class. His presentation was right up my alley. So he asked if I would like to see it. I said yes, and we had a wonderful one-on-one discussion, complete with demos on his laptop. Right there. On the edge of a planter. It was my favorite class. THAT is one of the many special things about PASS Summit.

One of the things I was really happy about this year was the change in how special diets were handled. In the past, they have always messed up my meals. I have a lot of food sensitivities, so I understand it’s difficult to accommodate my diet, but I still need to eat. This year, they had our names on our meals, and they had the more common special requests handled in the regular food line. (Thank you)

Day Eight – Thursday

I was invited again to live-blog the keynote on Thursday. You can see my comments here. I was very sad to hear that this would be the last keynote for PASS Summit given by David DeWitt and Rimma Nehme. I’m happy that I’ve been able to see two of their keynotes.

2015 PASS Summit Notes1

Before lunch, I thoroughly enjoyed directing people to lunch. (I'll tell you a secret: I think I enjoyed twirling my sign a little too much.)

Thursday is Women in Technology (WIT) Day. WIT puts on a great lunch, and I was in the perfect place to direct people to the regular lunch and the WIT lunch. One of the traditions of WIT Day is for men and women to wear kilts. This tradition was started by Grant Fritchey many years ago. This was my fourth year participating.

2015 PASS Summit16

Day Nine – Friday

Friday is full of sessions and bittersweet goodbyes. Now, I don't want you to think I only had fun this week. I did attend sessions. I have proof. I also took notes.

2015 PASS Summit17

 

… And I had fun.

 

2015 PASS Summit9

Summit is not complete without at least one night hanging out with Jason Strate at Bush Garden participating in SQL Karaoke. I was super happy to make it there multiple times this year. Last year, I wasn’t able to go at all due to all the commitments I had at Summit.

2015 PASS Summit8

Day Ten – Saturday

Saturday was a vacation day. I spent the day with my Australian friends. These are friends that I've made through Summit. I even worked with some of them for a few months. In the evening, Julie Koesmarno and I threw a dinner party for a few of our friends who were still in town. We were even able to invite some friends we met this year.

2015 PASS Summit19-001

Day Eleven – Sunday

Sunday I had to come back to reality on my flight home. While I had an amazing time in Seattle speaking, learning, networking, and reconnecting, I was happy to see my family and they were happy to see me.

LucybyVictoria

Exploring SQL Prompt 6.5

The SQL Prompt team has been working hard at the Redgate campus in England, and they have outdone themselves yet again with their latest release, SQL Prompt 6.5. I'm extra excited about this release because several of the features I've been wanting for a while are available this time around. In this post, I'm going to go over my top three favorite new features.

Tab Coloring

I was so happy to see this feature move from the Experimental Lab List to the Feature List. It was like watching Pinocchio become a real boy. Not only does this feature behave the way it did when it was on the Experimental Lab List, but they added some additional cool features to it.

So what does it do for you? It colors the tabs of each query window based on environment settings that you set up. You’re probably thinking, “Yeah, Mickey, but SSMS already does that in the status bar for me.” Well, the Tab Coloring in SQL Prompt is way better. Here is why.

I rarely look at the status bar, but I’m always looking at the top 20% of the screen, so for me, a very visual person, having the color in that top 20% of the screen is much more effective. SQL Prompt provides that for me, by coloring the tab of the query window and adding a line of the same color below the status bar. If you undock the query window, then the whole window is outlined in the tab color.

SQLPrompt6_5a

 

You can create as many environments as you want and you can pick any color…or shade of color that you want. This was one of the improvements to Tab Coloring from when it existed in the Experimental Lab list.

To modify the environments, follow these steps:

  1. Go to the SQL Prompt menu and select Options.
  2. Select Colors from the Tabs list on the left hand side.
  3. Click on the Edit Environments button in the bottom right hand corner, under the table.
  4. Add or modify an environment.
  5. Click on a color to modify the color to meet your needs.
  6. Apply these environments to the servers they represent on the previous window.

SQLPrompt6_5c

 

I’m notorious for saving stored procedures in the Master database in development. To help prevent this, I can set up a Tab Color for a specific database on a specific server. For me, that would be the Master database in dev. I love this!

To do this, follow these steps.

  1. Go to the SQL Prompt menu and select Options.
  2. Select Colors from Tab list on the left hand side.
  3. Click on the Add Server/Database link in the table.
  4. Add a Server Name in the first column.
  5. Add a Database Name in the second column.
  6. Select an Environment for the color in the third column.

SQLPrompt6_5d

Note: In the above example, all three colors are for the same server. This means that when the database is not Utility or Master, the tab coloring will be green…even for new databases on that server.

In Tab History, you can see which environment each query last ran against. In the example below, I can see that I ran SQLQuery6 in the Master database on SQLDemoMonkey. I know this because I set up a database-specific Tab Color for the Master database on SQLDemoMonkey. (I guess I should go make sure I didn't create a stored procedure in that database.)

SQLPrompt6_5b

Intelligent Intellisense for GROUP BY

One of the boring tasks when writing a GROUP BY clause is re-writing all the columns that you already wrote in the SELECT list. Many times, I'll copy what I wrote in the SELECT list and paste it under GROUP BY…but then the list still requires editing. Not any more. SQL Prompt now provides intelligent IntelliSense for GROUP BY. Take a look:

 

SQLPrompt6_5f

 

In this query I used Ctrl + to get the IntelliSense to pop up for my GROUP BY. At the top of the list, there is a shortcut for having all the non-aggregated columns inserted for you. Also notice that RequestType, from line 42, is not listed at the top of the IntelliSense box because it is an aggregated field.
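If you haven't fought this battle before, here's the kind of query I mean (the table and column names below are made up for illustration): every non-aggregated column in the SELECT list has to be repeated in the GROUP BY, and that repeated list is exactly what SQL Prompt will now build for you.

SELECT
     ItemPath
    ,UserName
    ,Format
    ,COUNT(*) AS Executions   -- aggregated, so it stays out of the GROUP BY
FROM dbo.ReportExecutions
GROUP BY                      -- the tedious part SQL Prompt now fills in
     ItemPath
    ,UserName
    ,Format;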

Format Actions and Ctrl + K, Ctrl + Y

I wrote a post, back in July of 2014, about SQL Prompt formatting shortcuts. I wrote:

As a power user, my key strokes will look like this when I use them all at once: Ctrl + bw, bq, bc, ky and finally Ctrl + e to execute my code. Here is an example of how an ugly duckling turns into a beautiful swan by applying all of these hot keys at one time.

That long list of key strokes is now obsolete. (And there was much rejoicing.) I can now select from a list of formatting options that will be applied with only the Ctrl + K, Ctrl + Y shortcut. This is so much easier!

To modify the list of formatting options, follow these steps.

  1. Go to the SQL Prompt menu and select Options.
  2. Select Actions under the Format list on the left.

 

SQLPrompt6_5e

 

It's A Wrap

I only listed three of the new features in SQL Prompt 6.5. I encourage you to take a look at the Release Notes for the full list. If you don't have SQL Prompt, then download a free 14-day trial here. If you already have SQL Prompt, then you can go to the SQL Prompt/Help/Check For Updates menu item to get the latest version.

Questions Answered From Presentation (Part 1): Creating SSRS Reports Efficiently Through Best Practices

This past Tuesday, March 3rd, 2015, I had the privilege of speaking for Pragmatic Works. For the month of March, they are highlighting Women in Tech and have the whole month lined up with female speakers on each Tuesday and Thursday. I was the first in the lineup for the month, and it ended up being my largest audience to date. There were 419 people listening in. I even saw some comments about there being others who couldn't make it. Wow!

Thanks to everyone who came to listen to my session. While I couldn’t hear anyone, I enjoyed reading the questions and comments that were sent to me afterwards, and let me tell you there were quite a few. Below you can find the answers to HALF the questions that were asked. The other half of the questions will be answered next weekend.

Questions Answered

1. The number one question that was asked was: Will your documents, like your requirements document, be available?

Yes! You can find all the assets for the presentation under the Resources\Presentation menu on my website. I list all my presentations on this page, so just scroll to the correct presentation title.

I did get some comments about my SSRS templates missing. That has been fixed. Here are the three separate downloads:

If you missed the presentation, don’t worry. Pragmatic Works will be adding it to their catalog of past presentations here.

2. Hey, can you please share us that sample database?

I’m not sure which sample database this was pertaining to. Please email me.

3.  Would you give that mock up tool again?

It’s called Balsamiq. They are a wonderful company with a great user support forum. You can find tons of templates on their website and get your feet wet with a trial version.

4. Do you have any scripting that can be used to see what subscriptions are set up using all the various parameters?

I broke this question up into two parts because I wasn't clear whether you wanted existing subscriptions or executed subscriptions.

Executed

The ExecutionLog3 view in the report server database will have this information for you. You can filter the data on the RequestType field to show only subscriptions. The Format field will tell you how the report was rendered, and the Parameters field will tell you what all the parameters were set to. The downside is that the Parameters field uses special escape characters, so you'll have to decipher it.


DECLARE
     @StartDate DATETIME = '03/1/2015'
    ,@EndDate   DATETIME = '04/5/2015';

SELECT
     InstanceName
    ,ItemPath
    ,UserName
    ,RequestType
    ,ItemAction
    ,TimeStart
    ,Status
    ,Format       -- how the subscription was rendered
    ,Parameters   -- parameter values (with escape characters to decipher)
FROM
    ReportServer.dbo.ExecutionLog3
WHERE
    ItemPath NOT LIKE '/DataSets/%'
    AND TimeStart >= @StartDate
    AND TimeStart <= @EndDate
    AND RequestType = 'Subscription'   -- only show subscription executions
    AND ItemPath <> 'Unknown';

Existing  

For existing subscriptions, you can look in the Subscriptions table in the ReportServer database. This table lists the subscriptions that are set up, how they will be rendered, and the parameter values that will be used, as well as other information.
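As an example, something along these lines lists each subscription with its report path and owner. This is only a sketch against the standard ReportServer schema (column names can vary a little between versions), so check it against your own instance first:

SELECT
     c.Path               AS ReportPath
    ,u.UserName           AS SubscriptionOwner
    ,s.Description
    ,s.DeliveryExtension  -- e.g. email or file share delivery
    ,s.LastStatus
    ,s.LastRunTime
    ,s.Parameters
FROM ReportServer.dbo.Subscriptions AS s
JOIN ReportServer.dbo.[Catalog]     AS c ON c.ItemID = s.Report_OID
JOIN ReportServer.dbo.Users         AS u ON u.UserID = s.OwnerID;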

5. Is there is way to bulk change the owner of a subscription? We have people leave and then when their account is deleted, the subscriptions stop working.

I've been asked this before. The information is stored in the ReportServer database, but I am always cautious about directly modifying this database. So, if you change it, test it heavily in development before applying the changes to production.
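For what it's worth, the change boils down to repointing OwnerID in the Subscriptions table from the departed user to a current one. Treat the following as a sketch only (the user names are made up), and test it in development first, as I said above:

DECLARE
     @OldOwner UNIQUEIDENTIFIER
    ,@NewOwner UNIQUEIDENTIFIER;

SELECT @OldOwner = UserID FROM ReportServer.dbo.Users WHERE UserName = N'DOMAIN\DepartedUser';
SELECT @NewOwner = UserID FROM ReportServer.dbo.Users WHERE UserName = N'DOMAIN\NewOwner';

BEGIN TRANSACTION;

UPDATE ReportServer.dbo.Subscriptions
SET OwnerID = @NewOwner
WHERE OwnerID = @OldOwner;

-- Check the number of rows affected before you commit.
COMMIT TRANSACTION;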

To help with this problem in the future, I would recommend a new standard. All subscriptions should be directed to Active Directory distribution lists. Even if the distribution list has only one person in it. I try to use existing distribution lists, but when I can’t, then I work with the Active Directory Admin (or whoever wears that hat). You could make a distribution list for a specific title, position, or category of people. I also made sure the description on the distribution list was filled out and included something like, “Do not delete this distribution list. It is used for subscriptions.”

6. What tool do you use for your report catalog?

There are several ways to create a report catalog. I started with an Excel document. After I outgrew that, I ended up creating a custom work item in TFS (Team Foundation Server). This allowed me to catalog the report, keep track of the workflow as the report went through its life cycle, and associate work items and bugs with the report.

Here is a blog post by Ted Gustaf on creating custom Workflows in TFS 2010.

7. Don't you think that it's better to get the requirements and have them sign off on that prior to a mock-up?

Yes, it would save us some work too. Unfortunately, I found that end users have a hard time verbalizing what they actually want. I find that the changes they request after seeing the mock-ups are not normally based on scope creep, but on the way they verbalized their needs when they initially gave them. This can be due to them not understanding what they really want, or to them assuming we know their data as well as they do. By creating the mock-up, they have something tangible to look at. It's analogous to telling a realtor that you want a three bedroom house, but the first one they show you is not what you were looking for. Now that you've seen the house, you can better articulate what you really want in a house.

8. Did you discuss the data quality requirements with your customers?

This is a very good question, and a delicate one. Some customers are very glad you bring this question up, and some customers assume all the data is accurate and will become concerned with past reports if they think the data is “dirty”.

Here are some scenarios and how I handled them. 

  • I was replacing an existing report that had bad data. This is an example of a delicate situation. My manager ended up coming with me to the meeting and backing me up. I showed samples of raw data under both reports and explained why the data was inaccurate before. Always save the explanation and data in an easy-to-locate place, because they will be back asking again why a previous report is so different from a current report.
  • I had to aggregate data based on key phrases in a text field. This is an example where the customer is expecting to have a dialog about the quality of the data found. I spent some time looking at the raw data that did not match her phrases exactly and provided recommendations on additional phrases.
  • One of the requirements I was given for a report was based on ranges of numbers. For this report, I approached my client about the data I found that did not fall into the ranges he was expecting. In this case, it uncovered the need to validate the data when it was entered, so the client was grateful.

The bottom line is, be careful how and to whom you bring this question up. It is a very important question and should always be addressed. It just may need to be addressed within your team instead of with the client, directly with the client, or in a very subtle manner with the client.

9. How do you place the Tablix color palette in the template form? And, could you provide, in your blog, info about Microsoft’s suggested methods of maintaining tracking info.

Below are the steps for creating a "color palette" on a new template. For your second question, I don't know if there is an article from Microsoft on best practices for tracking info. In next week's post, I have examples of how I did it.

Steps for adding your own color palette to a template

  • Add a table control (Tablix) to the body of a new report.
  • Remove the header row.
  • Make sure there is a column for each color in your color palette.
  • Make the width of each cell 0.2 in.
  • Change the background property of each cell to the name or hexadecimal value of your colors.

DeGraeve.com is a really cool website that can be used to create a color palette from an image.

Awesome quote from one of the attendees:
A formal report request process gives the requestor an opportunity to think closely about what is really wanted. Does not force it to happen, but increases the likelihood that a good and complete request will be done.

10. What do you use for Version Control?

I have used both Team Foundation Server (TFS) by Microsoft and Subversion, which is an open source product. There are a few Subversion clients out there; I used TortoiseSVN.

For maintaining my SQL source code easily, I also use Redgate's SQL Source Control product. It's like a bridge between your repository (TFS, Subversion, etc.) and SSMS. I am happy to talk to anyone about Redgate's products. Just send me an email. (No, I don't work for them. I'm just addicted to their products.)

11. Is it better to use Visual Studio or Report Builder?

Better is not the word I would use. More powerful is the phrase I would use. Report Builder has only a subset of the features that you have in Visual Studio. It was designed to empower "power users" and to provide "self-service BI solutions". If you are designing reports all the time, I would highly recommend using Visual Studio. To see the power of what you can do in Visual Studio, take a look at my session Scalable SSRS Reports Achieved Through the Powerful Tablix. It's recorded on the Pragmatic Works website.

Here's some additional terminology for you with regard to the different versions of Visual Studio. The version that comes with SQL Server is all that you need to design reports. For SQL Server 2012 and 2014 it's called SQL Server Data Tools, commonly referred to as SSDT. For SQL Server 2008 and 2008 R2 it was called Business Intelligence Development Studio, commonly referred to as BIDS.

12. What’s the major difference of standalone SSRS and the SSRS on SharePoint server?

I have never worked with the SharePoint version of SSRS. From what I know, the major difference is how the metadata for the management of the reports is stored in SQL Server. The SharePoint tables are used instead of the ReportServer database. I don’t think the creation of the reports is any different.

13. Can excel templates be imported or easily developed so that if I export to excel it looks like the data came from excel?

I'm not aware of the ability to bring in an Excel template as an SSRS template. If the user exports the report using the Excel render type instead of the CSV render type, then the report should look like it came from Excel. You might need to email me an image of what you are trying to accomplish for me to answer this question better.

Next weekend I will answer the rest of the questions from this presentation.

Thanks for all the fish

Thanks go out to Liz Hamilton at Pragmatic Works for moderating my presentation. She is wonderful to work with and I loved the movie trivia that she did prior to the presentation.

This is me sitting at a local café with my oldest daughter, sorting, compiling, and answering questions. (I think she's doing Latin homework. I'll stick to SQL and SSRS.)

Summit 2014 Experience Or My Week With the Aussies

Summit may officially only have three days (five days if you count the two pre-con days), but my "Summit 2014 Experience" this year was two weeks long. How can that be? Well, I hung out with my favorite Aussies.

(I will apologize right here. This is a looong post. I created a timeline for those that want the short version.)

#SQLAussieFamily comes to visit

Some of you may know that I'm really close friends with Julie Koesmarno, and she lives way too far away. So I asked her to come visit a week before Summit. That is when my "Summit 2014 Experience" officially started, when Julie arrived on my doorstep. After she settled in, we headed out to lunch, and then went shopping at Fry's for adapters. By Thursday (before Summit) two more Aussies came to visit, Martin Cairney and Ben McNamara. I hadn't met Ben before, so he gets to be the official first networking opportunity of my "Summit 2014 Experience". He's also the one I spent the most time with during Summit. The next day we hopped on a plane heading to Portland and their SQL Saturday.

Portlandia

I’m born and raised in California. We drive everywhere. Aussies, on the other hand, do things differently, and I loved it. We didn’t stay in a hotel. We used Airbnb and stayed in an old Victorian parsonage. It was wonderful. We could have taken a taxi to the house we rented, but instead we took our suitcases on a tour of Portland via the light rail, the street cars, and our feet. We even picked our dinner location based on walking distance from our house. So much fun.

Since I don’t want to leave any of my “flat mates” out, we were joined late Friday night by the lovely Heidi Hastings, straight from Australia.

SQL Saturday #337 – Portland

I love SQL Saturdays. Attendees get to learn for free, I get to teach about the things I love, I get to meet new people, and to top it all off, I get to see and hug all my SQL Family. At this SQL Saturday, I gave my presentation called: Changing Your Habits to Improve the Performance of Your T-SQL.

SQLSatOregonSpeakerRoom

I found my favorite moments to be in the bathroom. I know, you’re scratching your head, but it’s true. While I was heading out of the restroom, I was stopped by one of the other speakers. She introduced herself to me and told me how she has been reading this very blog for the last two years and she has enjoyed watching me grow. Wow, that touched me. I’ve never been told that before. It made my day.

Believe it or not, my Aussie friends and I were the very LAST people to leave the event, mostly because we needed to wait for a taxi and we waited until the last minute to call one. Once our taxi arrived, we headed for dinner. Someone had found a whiskey restaurant/bar called The Library. WOW! If you like whiskey, and you're in Portland, you need to go there. They had ladders to get to the top shelves!

TheLibrary

Make sure you get reservations though. We had to wait 1.5 hours! Which we did at a wonderful restaurant called Cassidy's. The food and company were wonderful. They even let me order a plate of bacon as an appetizer.

All aboard!

Sunday we needed to get to Seattle, so Neil Hambly joined our little Aussie Posse and we took a ride north on the train. This was a very relaxing train ride for Ben, Heidi, and Neil. Martin and I spent most of the time troubleshooting some domain controller issues on my laptop. In the end, Martin prevailed, and we all left the train happy. (And there was much rejoicing.)

TrainStation

 

Red Gate had a speaker dinner that night, just as I arrived in town, at a wonderful restaurant called Tango's. I have NINE different foods that I can't eat. Annabel (from Red Gate) gave the chef my list of allergies, and they made me a custom meal. It was SO delightfully delicious. Here is a picture of what I considered my dessert. A salad with fruit and BACON! Yum!

SQL in the City

Monday was spent with my favorite vendor, Red Gate. They put on an amazing free event called SQL in the City. I was honored to be a speaker for the 2nd year in a row. This year I had two presentations. I gave a lightning talk at the end of the day called: Finding the delta with SQL Compare and backups. I also gave a full session in the middle of the day called: Customize your faux test data with SQL Data Generator. I was very pleased with this presentation. I showcased a tool that isn't talked about very often and demoed the new features that have been implemented over the last year or so, including the ability to use Python to better customize the faux test data. It was quite a bit of fun to present.

While I was wandering and networking, I ran into John. The two of us participated in the First Timers dinner that was put on in 2012. He told me that he had been reading my blog ever since then and that he enjoyed watching me grow from a first timer at Summit to a first-time speaker at Summit.

Has Summit started yet?

No, Summit hasn't officially started yet, but networking has. After networking throughout the day at SQL in the City, I had dinner with a couple of friends on our way to the beloved Tap House, where I did more hugging and networking.

SSIS Pre-con and Mickey

Every day of this amazing "Summit 2014 Experience", I was giving back to the SQL community, either through speaking at an event or doing some other volunteer duty. But Tuesday…Tuesday was for me. I took a pre-con from Brian Knight and Devin Knight called: SSIS: Problem, Design, Solution. They are amazing presenters to watch, and I enjoyed the content I learned.

That evening I ran around like a chicken with its head cut off trying to say hi to (and hug) everyone I knew. I'm pretty sure I was still saying hi to (and still hugging) people on Thursday that I hadn't seen in a year. In fact, there are a couple of people I missed completely. That's not too surprising; there were only 5000 people wandering around.

It was really fun getting to meet some of my Twitter friends whom I hadn't met in person yet, like Andre Ranieri, Anil Mahadev, Annette Allen, and Adam Machanic. Now I have voices to go with the faces. (Hmmm…my data is skewed. All the people I listed have a first name that starts with "A".)

Woot! First Day of the Summit

I started the day off at the Bloggers Table at the first Keynote. This was very exciting. You see, I was invited to live blog the keynote. I’m no Brent Ozar when it comes to live blogging, but I had a great time and I had at least one reader following the live blog.

After Julie and I had breakfast, I prepped for my very first Summit session, which was scheduled for Thursday. I’m so glad that I prepped mid Wednesday, because I was able to enjoy the rest of the afternoon and evening.

Which brings me to the last session of the day and the first session that I was able to attend. I took Mladen Prajdic's class called: SQL Server and Application Security for Developers. Anyone who has to write any inline SQL should take his class. It was great.

I’ll just tell you right now. I networked EVERY single night. Why? Well, I got laid off right before I left for Summit. If you are going to get laid off, do it right before Summit, because it is the absolute BEST place to get the word out. I came home with several potential opportunities. I even gave up taking classes so that I could make deeper connections in the SQL community.

THE BIG DAY

I was bummed to miss the Keynote on Thursday, but I don’t sit still well before a big presentation. I thought it was more important to have a good breakfast.

Since I was up first for the day, I went to my classroom forty minutes early…and there were people already there! It gets better! I asked the room proctor if she could tell me how many people attended my session after it ended. She thought I was someone else, and started telling me about how many attendees they expected for my session. She told me that there were 232 people who had added my session to their schedule and that I was on the "overflow watch list". (Yeah, that made me a tad nervous.) That is a big class! Do you want to hear the best part??? I ended up with THREE HUNDRED AND SIXTY ONE attendees in my session. That's over 50% more than expected. There were only 12 empty seats. That's more people than the number of students in my youngest daughter's elementary school! Did I mention this was the FIRST time I was speaking at PASS Summit?

While those numbers still blow my mind, it wasn't my favorite part of my session. Not even close. There were two other much more exciting events that happened in my session. The first was the attendees themselves. They were engaged. They were so engaging that I ran a bit over. I felt bad about that, but my class was so interactive, and I love that. The second happened during the six minutes prior to the session starting.

I was completely ready to start and didn't know what to do with myself. There was a very low murmur in the room, but I could see many people just sitting there waiting for the session to start. So I turned my mic on and announced, "My favorite thing about Summit is the networking. I want every one of you to turn to the person on your left and the person on your right and introduce yourself. I will then know you have networked with at least two people today."…and the room exploded in conversation. I even introduced myself to two people in the front row. I then waited for my session to start with a huge grin on my face. I even had to quiet the room down when I started. Man, that was awesome.

My Thursday is not done yet

I didn't get to attend any sessions this day because I was busy doing other things. I attended the Women in Technology lunch. I hung out in the community zone for an hour. And I was asked to live blog a Q & A session with the Executive Board. This was a way for the bloggers to ask questions of the Executive Board and blog about them before the Q & A session that was held on Friday. I was honored to be asked to participate. I really have a lot of respect for our Executive Board. It's not easy pleasing the entire world of SQL professionals.

That evening I was invited to a special dinner put on by Red Gate for their Friends of Red Gate members. I’m mentioning it here, because of where we had dinner. We ate at a special restaurant called FareStart. This amazing restaurant helps the homeless get back on their feet by training them, housing them, and feeding them. All proceeds from the restaurant go back into the program. The chef and main waiting staff are all volunteers and there is even a waiting list to work there. I encourage you to follow the link and read about the restaurant. (This restaurant was also great about my food allergies. I had an AMAZING meal.)

Another great thing happened at the Friends of Red Gate dinner. I was able to introduce Chris Yates to Julie Koesmarno. The three of us, plus Jeffrey Verheul from the Netherlands, all blog together under the hashtag #SQLCoop. This was the first time that Chris and Julie had met in person. It's the little things, like introductions, that make me happy.

And you know what I did after dinner. I networked… and celebrated this amazing day.

Friday already?

I actually was able to attend classes all day on Friday. I was quite happy about that. I was even able to attend Martin Cairney's session called: Thinking Out of the Box: Manage SQL Server Using Built-in Tools. He had the very last session of Summit, and it was wonderful.

At lunch I led one of the Birds of a Feather lunch tables in the discussion of Data Models. Our table was pretty full and we had a great discussion about someone's data model challenges.

2014-11-07 21.39.38

Summit may officially end at 5:15 PM on Friday, but the “Summit 2014 Experience” doesn’t end until you’re buckled into the seat of an airplane, train, or car. (Hopefully not a straitjacket.)

Two of my Aussie “flat mates” and I snuck off to a wonderful Lebanese restaurant for dinner. The food was great, but I could have done without the steep hills to get there. I guess it is a requirement to walk up and back down at least one steep hill while you are in Seattle. After dinner we went to a birthday party and finally ended up at… the Tap House where I continued to network.

Bittersweet

My “Summit 2014 Experience” started with the Aussies and ended with the Aussies. We all went out to breakfast one last time at the Daily Grill. It was a great breakfast, but it was still sad knowing that I probably wouldn’t see these wonderful people for another year.

 

IMG_3270-003

Thanks for all the fish

I wish I could give a shout out to all the SQL Family members I spent time with, but there are far too many of you. I do want to thank everyone for being part of a fabulous community. It is a rarity in the computer programming world and we are all very fortunate to be part of it.

Questions Answered From Presentation: Changing Your Habits to Improve the Performance of Your T-SQL

QuestionMark_127880048

This past Tuesday, Oct 8th, 2014, I had the privilege of speaking for the DBA Fundamentals Virtual PASS Chapter. It was my biggest audience to date and the fourth biggest audience for the user group to date. There were 374 people listening in, and they weren’t all from the US. I was thrilled to see that I had at least one person listening in from the UK.

Since there were so many people online, I couldn’t answer all the questions… thus my post today. You can download the slide deck, demos, and my sample database. (All my presentations eventually find their way to my resources page, here.)

I will post the link to the presentation on this page when it becomes available.

Questions Answered

1.    What is the name of the Red Gate tool for source control you mentioned?

Red Gate’s tool is called SQL Source Control. It’s like a bridge between your source control repository and SSMS or Visual Studio. They have a trial version that you can download here.

2.    What to do when you come into a team and there was no SQL coding standards before and we do not want to spend time re-writing everything?

I would first get buy-in on why standards are important to have and outline the standards you want to use. Then I would slowly change the code. What I mean by this is: update the code as you make changes to it. For instance, if you need to modify stored procedure XYZ, I would apply the standards to only that stored procedure, or even to only the part of the stored procedure that you change.

3.    What version of SSMS are you using?

In the presentation I was using SQL 2012, but all the concepts I covered applied to SQL 2005, 2008, and 2008 R2. Even the Template Browser and Template Parameters I covered are available in those versions. (Talk about a well-kept secret!) Note: If you need the sample database in SQL 2008 or SQL 2008 R2, please let me know and I’ll see what I can do.

4.    Can you give an example of commenting within a stored procedure?

Sure. You can comment in two different ways. You can comment at the end of a line using two dashes, or you can comment anywhere in the code using /* and */ to enclose the comment. The second method allows comments to span multiple lines.

When I’m adding comments about my changes, I will include the date, my name, and a brief comment. If I’m lucky enough to have a work item or bug tracking application to keep track of my work, then I’ll reference that number as well. Here is an example.


USE AWMonkey
GO
/* ***********************************************************************************
Purpose:    Returns list of Territory Groups.
Notes:
Sample:
EXEC List.GetTerritory
Author:     THEZOO\mstuewe
Date:       9/16/2014

Revision History
(Change Date)    (Author)        (Description of Change)
-----------------------------------------------------------------
10/10/2014       Mickey Stuewe   #2345 Added All to the list of values as requested
                                 by end user.
************************************************************************************ */
ALTER PROCEDURE List.GetTerritory
AS
BEGIN TRY

     SET NOCOUNT ON;

     SELECT
          st.TerritoryID
          ,[Group] AS CountryRegion
          ,st.Name AS Territory
     FROM
          AdventureWorks2008R2.Sales.SalesTerritory AS st
     UNION ALL
     SELECT
          -1
          ,' All'
          ,' All'
     ORDER BY
          CountryRegion; --Added the ALL to the list.
END TRY
BEGIN CATCH

     DECLARE
          @ErrorMessage AS nvarchar(3000)
          ,@ErrorSeverity AS int;

     SET @ErrorMessage = ISNULL(DB_NAME(DB_ID()) + N'.' + SCHEMA_NAME(SCHEMA_ID()) + N'.'
          + OBJECT_NAME(@@PROCID, DB_ID()), N'SQL Object Name Not Available')
          + N': Error: ' + CONVERT(nvarchar(10), ERROR_NUMBER())
          + N' Line: ' + CONVERT(nvarchar(5), ERROR_LINE())
          + N' - ' + ERROR_MESSAGE();

     SET @ErrorSeverity = ERROR_SEVERITY();
     RAISERROR(@ErrorMessage, @ErrorSeverity, 1);
END CATCH;

SET NOCOUNT OFF;

5.    Is there any configuration for the Template Browser? Can it integrate with any source control?

No configuration is needed for the Template Browser itself. Each source control repository is a little different, but here are the general steps to integrate your Template Browser folder with your repository.

1. Open the location of the Template Browser. Since I’m using SQL 2012, I found mine here:

C:\Users\<Your User Name>\AppData\Roaming\Microsoft\SQL Server Management Studio\11.0\Templates\Sql\<My Templates>

2. Import your templates into your source control repository.

3. Set your templates folder as the “working folder” location.

4. Test by getting the latest from your repository.

5. Connect the template browsers on your team’s computers the same way.

6. Let them know when you make changes to the templates so that they can do a “get latest” to download the changes to their computers.

6.    Is there any kind of Freeware repository tool?

Yes, Subversion is an open-source and widely used repository tool. It has two parts. The first part is the repository, which resides in a common location, usually on the network, so that everyone can use it. Then each user needs a Subversion client tool to connect to the repository. There are many different clients to pick from. I use TortoiseSVN. Why? It was suggested to me and recommended by Red Gate. Since I was going to use SQL Source Control by Red Gate to connect to Subversion, I decided to stick with their recommendation.

7.    So, Row_Number() is non-deterministic?

Yes, that is correct. ROW_NUMBER() is nondeterministic: when the ORDER BY columns in the OVER clause contain ties, the tied rows can be numbered in any order from one execution to the next.
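
To illustrate (a minimal sketch using a throwaway table variable), the two rows below tie on Score, so ROW_NUMBER() is free to give them row numbers 1 and 2 in either order from one run to the next. Adding a tie-breaking column to the ORDER BY makes the numbering repeatable.

DECLARE @Scores TABLE (StudentID int, Score tinyint);

INSERT INTO @Scores (StudentID, Score)
VALUES (35, 90), (36, 90), (42, 85);

SELECT
     StudentID
     ,Score
     -- Students 35 and 36 tie on Score, so their row numbers are not guaranteed
     -- to come out the same way every time the query runs.
     ,ROW_NUMBER() OVER (ORDER BY Score DESC) AS RowID
FROM
     @Scores;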

8.    Is there any difference or improvements in using a CAST vs CONVERT or vice versa?

There isn’t a difference between the two functions in terms of performance. As Jeff Joy kindly pointed out, the difference between the two comes down to compliance.

Jeff Joy – CAST is ANSI-compliant, while CONVERT is a Microsoft-specific casting function that builds in additional formatting functionality. If you may ever need to convert to another platform and need to write code that is cross-platform compliant, use CAST.

The downside of writing all of your code to ANSI-compliant standards is that you miss out on quite a few rich features that Microsoft has added to the SQL language. These features can help SQL statements perform faster or make your life easier.
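
As a small sketch of the difference Jeff describes, both functions change the data type, but only CONVERT takes an optional style argument for formatting:

DECLARE @d datetime = '2014-10-08T17:00:00';

SELECT CAST(@d AS varchar(30));        -- ANSI-standard syntax, no formatting control
SELECT CONVERT(varchar(30), @d, 112);  -- Microsoft-specific; style 112 formats the date as yyyymmdd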

9.    Isn’t CTEs much better than subqueries? This is in regards to readability?

Yes, CTEs (Common Table Expressions) are much easier to read. There are even patterns that can be accomplished with CTEs that can’t be accomplished with subqueries, but CTEs aren’t always faster. I have had to rewrite a few queries with subqueries because I needed every millisecond of performance.
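
Recursion is one example of a pattern a subquery can’t express. Here is a minimal sketch of a recursive CTE that builds the numbers 1 through 10:

WITH cte_Numbers (n)
AS
(
     SELECT 1          -- anchor member
     UNION ALL
     SELECT n + 1      -- recursive member references the CTE itself
     FROM cte_Numbers
     WHERE n < 10
)
SELECT n
FROM cte_Numbers;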

10.    Can I see the trace options selected?

Yes, here are the events I captured. I also filtered the trace down to the demo database.

 

QuestionsAnswered20141010_Image1

11.    Is there a standards document about T-SQL Formatting?

I don’t have one I can give you. I can create a template for you if you would like. Just send me an email.

12.    Can you e-mail that URL to us?

I’m not sure which URL this is referring to, so I’ve listed the URLs that were mentioned in my presentation.

13.    Is there better performance between CTE and Apply?

Comparing CTEs and APPLY is like comparing apples and oranges. They are two different things. A CTE gives you the ability to create a query to be used within another query. APPLY is an operator like INNER JOIN, but it is specifically used to join table-valued functions (inline and multi-statement), subqueries, and CTEs, and it lets the right-hand side reference columns from the left-hand side.
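
Here is a hedged sketch of what that correlation looks like; the dbo.Customer and dbo.SalesOrder tables and their columns are hypothetical names used only for illustration. The inner query references c.CustomerID, which a plain JOIN to a derived table cannot do:

-- Return each customer's three most recent orders.
SELECT
     c.CustomerID
     ,recent.OrderDate
     ,recent.TotalDue
FROM
     dbo.Customer AS c
     CROSS APPLY (
          SELECT TOP (3)
               so.OrderDate
               ,so.TotalDue
          FROM
               dbo.SalesOrder AS so
          WHERE
               so.CustomerID = c.CustomerID    -- correlation to the outer table
          ORDER BY
               so.OrderDate DESC
     ) AS recent;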

14.    On the comparison of temporary, table variables, and materialized tables…Why was the second execution plan eligible for auto-parameterization and the others were not?

This will take a little bit of investigation since I don’t know off the top of my head. I will post this one in the next week or so.

15.    What is the proper way to use DISTINCT with lots of columns in the SELECT without duplicate records? (Per our SDLC, avoiding using DISTINCT with lots of columns.)

The DISTINCT operator usually adds a costly Sort (or Hash Match) operator to the execution plan. Unfortunately, sometimes you can’t avoid it. One thing that you can do is see if there are other ways to filter the data so that there aren’t duplicate records. For example, maybe there is a column that marks the latest record as active. You could filter on that field to provide uniqueness across the rest of the fields. Another example would be joining to another table that will help reduce the records, but not alter the data you need to return.
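
For example, here is a hedged sketch with a hypothetical dbo.CustomerAddress history table and an IsActive flag; filtering on the column that already guarantees uniqueness removes the need for DISTINCT:

-- Instead of de-duplicating every column...
SELECT DISTINCT
     ca.CustomerID
     ,ca.AddressLine1
     ,ca.City
FROM
     dbo.CustomerAddress AS ca;

-- ...filter on the flag that marks the one current row per customer.
SELECT
     ca.CustomerID
     ,ca.AddressLine1
     ,ca.City
FROM
     dbo.CustomerAddress AS ca
WHERE
     ca.IsActive = 1;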

IMHO, your SDLC is trying to avoid unnecessary DISTINCT operators in the code. I had added that same requirement to my SDLC document when DISTINCT was being abused. I later added the comment, “If you can prove the need for the DISTINCT operator, then the use of DISTINCT will be approved.”

16.    About predicates, and especially LIKE: we know that for a predicate to be efficient we shouldn’t start with a wildcard, but if we need to return data that ends with a specific string, is there a way to improve that?

See # 17 below.

17.    REVERSE string?

I wanted to answer 16 and 17 together. #18 is related, but I answered it separately. While all the string functions are deterministic, the particular pattern that needs to be implemented (looking at the end of the string) does not lend itself to an efficient execution plan. This has to do with how indexes are built. The keys in the index are stored sorted, which means that finding records can only be efficient when comparing strings from left to right, not right to left. You can see this by running the following sample on the demo database I provided.

I created an index on the field we are looking at, then I tried various techniques to get the last three characters. All three execution plans show an Index Scan.


USE DemoProgramming
GO

IF EXISTS (SELECT * FROM sys.indexes WHERE object_id = OBJECT_ID(N'dbo.TestData') AND name = N'IX_TestData_SomeDateText')
     DROP INDEX IX_TestData_SomeDateText ON dbo.TestData WITH ( ONLINE = OFF )
GO
CREATE NONCLUSTERED INDEX IX_TestData_SomeDateText ON dbo.TestData
(
     SomeDateText ASC
)

SELECT
     SomeDateText
FROM
     dbo.TestData AS td
WHERE
     td.SomeDateText LIKE '%-09'

SELECT
     SomeDateText
FROM
     dbo.TestData AS td
WHERE
     REVERSE(SomeDateText) LIKE '90-%'

SELECT
     SomeDateText
FROM
     dbo.TestData AS td
WHERE
     LEFT(REVERSE(SomeDateText), 3) = '90-'

When you run this sample, you do get an Index Seek.


SELECT
     SomeDateText
FROM
     dbo.TestData AS td
WHERE
     SomeDateText LIKE '2004%'

With all that said, if a particular column is constantly being split into separate parts to be searched, then I would consider storing those separate parts in their own fields in the database. That way they can be indexed, which will improve performance.
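
For this particular end-of-string pattern, one option is to persist the reversed value as a computed column and index it; the trailing-wildcard search then becomes a leading-string search that can use an Index Seek. This is only a sketch, and the column and index names below are my own:

ALTER TABLE dbo.TestData
     ADD SomeDateTextReversed AS REVERSE(SomeDateText) PERSISTED;

CREATE NONCLUSTERED INDEX IX_TestData_SomeDateTextReversed
     ON dbo.TestData (SomeDateTextReversed);

-- '%-09' against the original column becomes '90-%' against the reversed column.
SELECT
     SomeDateText
FROM
     dbo.TestData AS td
WHERE
     td.SomeDateTextReversed LIKE '90-%';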

18.    Full Text index vs Like?

I have not had the opportunity to use or even investigate Full Text indexes, so I can’t comment on this at this time. (sorry)

19.    Is it ok to manually create tables in the temp DB? Would that be the same as creating materialized tables, but will be wiped out after a DB server restart?

Temp tables and global temp tables are tables within TempDB. As far as creating materialized or permanent tables within TempDB goes, that is not a good idea. TempDB can already be burdened by all the other queries that use temp tables, table variables, and worktables. It would be better to create materialized tables outside of TempDB. (And yes, any permanent table created directly in TempDB disappears when the instance restarts, because TempDB is recreated on startup.)
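
To make the distinction concrete, here is a quick sketch; the table names are just placeholders:

-- Local temp table: visible only to this session, dropped automatically when the session ends.
CREATE TABLE #ScoreStage (StudentID int, Score tinyint);

-- Global temp table: visible to every session, dropped when the last referencing session ends.
CREATE TABLE ##ScoreStage (StudentID int, Score tinyint);

-- Permanent table created directly in TempDB: possible, but it competes with every other
-- TempDB workload and is gone after the next instance restart.
USE tempdb;
CREATE TABLE dbo.ScoreStage (StudentID int, Score tinyint);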

20.    For the Template explorer…how do you share your templates to other users in SQL because when we create templates they seem to only be available to the user specifically that created them?

See Question #5, where I walk through putting the Template Browser folder under source control so the whole team can share the same templates.

21.    What’s the best hosting solution for MSSQL server standard/enterprise?

This is outside the scope of this session and an area I don’t have a lot of experience with.

22.    Does including the Actual Execution plan also put an overhead on the server?

No. The Actual Execution plan is what is created when a query is run. The query itself can cause overhead, but not the plan. In fact, the engine allows a specific amount of time to find the best plan. If it can’t go through all the different possibilities, then it will do the best it can with the possibilities it did evaluate. You can see this in the execution plan. Click on the SELECT icon and look at the properties. For this query, it told me it found a “good enough plan”.

 

QuestionsAnswered20141010_Image2

 

If you want to get the last execution plan that was generated for a query, then you can get it using sys.dm_exec_query_plan. See how to use this DMV here.
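
Here is a minimal sketch of pulling cached plans with that DMV; filtering sys.dm_exec_sql_text on a table name is just one way to find the statement you care about:

SELECT
     st.text AS QueryText
     ,qs.execution_count
     ,qp.query_plan          -- click the XML in SSMS to open the graphical plan
FROM
     sys.dm_exec_query_stats AS qs
     CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
     CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE
     st.text LIKE '%TestData%';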

Last but not least…

23.    What’s your favorite whisky?

Well, in the immortal words of Grant Fritchey… that depends. It depends on where I am, what I’m eating, and how many people I’m with. (I know, that is a weird one. But I find I can handle the smokier Scotches when I’m around a group of friends, but not when I’m home relaxing.) Also, I drink all my whiskey neat.

So, since I don’t have one favorite, I’ll answer the question by country in the order of my preference.

  • Ireland – I don’t think I’ve met an Irish whisky I didn’t like, but my favorite (right now) is Redbreast.
  • USA – I love 100% Rye whiskey! Michter’s Rye and High West Rendezvous Rye are at the top of my list.
  • Scotland – I like the fruity and spicy Scotches from Speyside, with Oban being at the top of the list.
But wait, there is more

Mark Finch mentioned a free formatting tool. I wanted to include it in my summary since I had never heard of it. You can find it at poorsql.com.

Thanks for all the fish

I want to thank everyone for attending my presentation, as well as for all the encouraging feedback I received.

Going Beyond The INSERT Statement

The Seventh Mission

In this installment of SQLCoOp, we are writing about CRUD. No, this is not about the stuff you scrape off the bottom of your shoes after hiking. This is Create, Read, Update, and Delete. I’ll be focusing on create, which is done with the INSERT Statement, but I’m going to go beyond the basic INSERT statement.

MCJJ_ALL

Multiple Rows

Let’s start with a feature of the basic INSERT…VALUES statement that we all know and love. In SQL Server 2008, Microsoft improved the INSERT…VALUES statement by allowing you to include multiple lists of values in the same query. This allows you to insert multiple rows of data with one INSERT statement. I find this syntax to be a wonderful feature when I’m creating small sets of data using Excel.

Here is an example of what the syntax looks like. The values for each row are enclosed in parentheses and separated by commas. In this example, I’m entering student scores for tests the students took at the end of September. After this one query runs, there will be 12 rows in the table.


CREATE TABLE StudentScore
(
     StudentScoresID int IDENTITY PRIMARY KEY
     ,StudentID int
     ,TestingRoomID int
     ,Score tinyint
     ,TestDateTime datetime
);

 

INSERT INTO StudentScore
     (StudentID,TestingRoomID, Score, TestDateTime)
VALUES
     (35, 1, 110, '2014-09-30 17:00:00.00')
     ,(36, 1, 87, '2014-09-30 17:00:00.00')
     ,(42, 1, 94, '2014-09-30 17:00:00.00')
     ,(10, 1, 99, '2014-09-29 12:00:00.00')
     ,(35, 10, 90, '2014-09-29 17:00:00.00')
     ,(36, 10, 100, '2014-09-29 17:00:00.00')
     ,(42, 10, 105, '2014-09-29 17:00:00.00')
     ,(10, 10, 99, '2014-09-29 17:00:00.00')
     ,(35, 5, 115, '2014-09-28 17:00:00.00')
     ,(36, 5, 70, '2014-09-28 17:00:00.00')
     ,(42, 5, 90, '2014-09-28 17:00:00.00')
     ,(10, 5, 67, '2014-09-28 17:00:00.00');
Common Table Expressions (CTEs)

I love CTEs, sometimes a little too much. The reason I think they are so great is twofold. First, they can make complicated queries easier to read and maintain. Second, they can perform complicated functionality such as recursion without breaking a query up into multiple statements. An added benefit is that they can be used with all CRUD statements, including the INSERT statement.

Let’s take a look at an example.

I have a group of students that were given the same test multiple times in different classrooms on different dates. I want to insert only the row representing the highest score a student had. I also want to make sure that I keep the first time they received this score and any other data that came with that record.

Below I’ve highlighted the rows that I want to insert into the new table.

SQLCoOp7_Image1

The first step is building the SELECT statement for the CTE that numbers the scores for each student. This is done by using the ROW_NUMBER function in conjunction with the OVER clause. The OVER clause has two parts. The first part uses PARTITION BY, which restarts the numbering for each partition. In this case we will partition the data by StudentID. The second part uses ORDER BY, which tells the function how to order the data so that the rows consistently come out with the same row numbers. Since we want the highest score first, we will order the data by Score in descending order. Then we will order the duplicates by TestDateTime so that the first date the score was achieved will appear first in the order.

SELECT
     ss.StudentScoresID
     ,ss.StudentID
     ,ss.TestingRoomID
     ,ss.Score
     ,ss.TestDateTime
     ,ROW_NUMBER() OVER (PARTITION BY ss.StudentID
               ORDER BY ss.Score DESC, ss.TestDateTime) AS RowID
FROM
     dbo.StudentScore AS ss

This SQL statement above will be used in our CTE. We’ll then use all the rows that have a RowID of 1 for the INSERT. The final INSERT statement will look like this.

CREATE TABLE dbo.StudentScoreHigh
(
     StudentScoresID int PRIMARY KEY
     ,StudentID int
     ,TestingRoomID int
     ,Score tinyint
     ,TestDateTime datetime
);

 

WITH cte_Scores
AS
(
     SELECT
          ss.StudentScoresID
          ,ss.StudentID
          ,ss.TestingRoomID
          ,ss.Score
          ,ss.TestDateTime
          ,ROW_NUMBER() OVER (PARTITION BY ss.StudentID
               ORDER BY ss.Score DESC, ss.TestDateTime) AS RowID
     FROM
          dbo.StudentScore AS ss
)
INSERT INTO dbo.StudentScoreHigh
(
     StudentScoresID
     ,StudentID
     ,TestingRoomID
     ,Score
     ,TestDateTime
)
SELECT
     StudentScoresID
     ,StudentID
     ,TestingRoomID
     ,Score
     ,TestDateTime
FROM
     cte_Scores
WHERE
     RowID = 1;
MERGE

A different way to approach this same problem would be to use the MERGE statement. The MERGE statement is like CRUD on steroids. It will look at all the incoming data you specify and compare it to the existing data. Then you determine what will happen: do you want to insert, update, delete, or ignore the data based on what does and does not match?

For this example, we’ll assume we are constantly updating the StudentScoreHigh table with each student’s highest score. We’ll compare what is already in the StudentScoreHigh table (the target) with what is in the StudentScore table (the source) each night. If there is already a record in the table and the student achieved a higher score today, then the existing record will be updated. If the student does not exist in the table, then the student will be inserted into the table. Note: for this example, we’ll assume that the student could only take the test once in a day.

The MERGE statement will look like this.


MERGE dbo.StudentScoreHigh AS target
USING
(
     SELECT
          StudentScoresID
          ,StudentID
          ,TestingRoomID
          ,Score
          ,TestDateTime
     FROM
          dbo.StudentScore AS ss
     WHERE
          TestDateTime >= DATEADD(d, -1, GETDATE())
) AS Source ON Target.StudentID = Source.StudentID
WHEN MATCHED AND Source.Score > Target.Score THEN
UPDATE SET
     StudentScoresID = Source.StudentScoresID
     ,StudentID = Source.StudentID
     ,TestingRoomID = Source.TestingRoomID
     ,Score = Source.Score
     ,TestDateTime = Source.TestDateTime
WHEN NOT MATCHED THEN
INSERT
(
     StudentScoresID
     ,StudentID
     ,TestingRoomID
     ,Score
     ,TestDateTime
)
VALUES
(
     Source.StudentScoresID
     ,Source.StudentID
     ,Source.TestingRoomID
     ,Source.Score
     ,Source.TestDateTime
);
BONUS

If you have the latest version of Red Gate’s SQL Prompt installed (version 6.4), then you will have an even easier time writing your favorite INSERT statement. They added a new feature that will highlight the field of the value you are modifying and vice versa. Below you can see that my cursor is on line 61 and line 54 is highlighted. If I were to put my cursor on line 54, then line 61 would be highlighted.

 

SQLCoOp7_Image2

Don’t Stop Yet

Don’t forget to check out these blog posts by the rest of the SQL CoOp team on the subject of CRUD:

To follow our quest for SQL knowledge through this collaborative project, follow the #SQLCoOp tag on Twitter.

See you next time!!

On a SQL Collaboration Quest

Four SQL professionals gathered from the four corners of the world to share their SQL knowledge with each other and with their readers: Mickey Stuewe from California, USA, Chris Yates from Kentucky, USA, Julie Koesmarno from Canberra, Australia, and Jeffrey Verheul from Rotterdam, The Netherlands. They invite you to join them on their quest as they ask each other questions and seek out the answers in this collaborative blog series. Along the way, they will also include other SQL professionals to join in the collaboration.

Original Post: On a SQL Collaboration Quest

Becoming a SQL Prompt Power User

MCJJ_ALL

Four SQL professionals gathered from the four corners of the world to share their SQL knowledge with each other and with their readers: Mickey Stuewe from California, USA, Chris Yates from Kentucky, USA, Julie Koesmarno from Canberra, Australia, and Jeffrey Verheul from Rotterdam, The Netherlands. They invite you to join them on their quest as they ask each other questions and seek out the answers in this collaborative blog series. This month we are cooperatively blogging about SQL Prompt by Red Gate.

The Next Mission

What makes a SQL Prompt power user? Well, they use the SQL Prompt commands as if they were extensions of SSMS. They know the shortcuts, and they’ve memorized the key words to call up their favorite scripts. Let’s take a look at what these shortcuts are and how to integrate them seamlessly into your everyday SQL life.

Playing The Ebony Keys

A power user of any software is one who uses the hot keys in lieu of the menus. SQL Prompt has a plethora of hot key combinations.

Before I use <ctrl> + e to execute my code, I always use <ctrl> + k, <ctrl> + y. This hot key combo will structure your code based on the styles you have set in SQL Prompt. It is my most heavily used hot key combo, right after <ctrl> + e and <ctrl> + r (hiding the results window).

There is a new hot key combination coming soon in version 6.4, and I’m very excited about it. It adds a semicolon at the end of each statement! I’ve been trying to build the habit of using semicolons for two years now, and it never sticks. Now I can use <ctrl> + b, <ctrl> + c and away I go.

Here is a table of the hot key combinations I use the most, plus the shortcut way I remember them.

  • <ctrl> + b, <ctrl> + w (shortcut: <ctrl> + bw) – Expands ALL of the wild cards.
  • <ctrl> + b, <ctrl> + q (shortcut: <ctrl> + bq) – Qualifies all the SQL objects and fixes the capitalization as well.
  • <ctrl> + b, <ctrl> + c (shortcut: <ctrl> + bc) – Adds semicolons to the end of each statement. (In beta; available soon in 6.4.)
  • <ctrl> + k, <ctrl> + y (shortcut: <ctrl> + ky) – Standardizes the code based on the formatting styles that were set in SQL Prompt.

As a power user, my keystrokes will look like this when I use them all at once:

<ctrl> + bw, bq, bc, ky and finally <ctrl> + e to execute my code. Here is an example of how an ugly duckling turns into a beautiful swan by applying all of these hot keys at one time.

Before


alter procedure [dbo].[salesteamassignment_rptugly](@group as varchar(1000), @startdate as date, @enddate as date)as
set nocount on
declare @uselessvar as varchar(max)
select
p.businessentityid ,firstname,lastname,s.name as store,[group] as TerritoryGroup,cr.name as CountryRegion,st.name as Territory,customercount,s.*
from
adventureworks2008r2.sales.salesperson as sp join adventureworks2008r2.person.person as p on sp.businessentityid = p.businessentityid
join adventureworks2008r2.sales.salesterritory as st on sp.territoryid = st.territoryid join [adventureworks2008r2].[person].[countryregion] as cr on st.countryregioncode = cr.countryregioncode
join adventureworks2008r2.sales.store as s on s.salespersonid = sp.businessentityid join dbo.fn_split(',', @group) as grps on st.[group] = convert(nvarchar(50), grps.separatedvalue)
outer apply (select count(*) as customercount from adventureworks2008r2.sales.customer as c where s.businessentityid = c.storeid ) as cc

After

ALTER PROCEDURE dbo.SalesTeamAssignment_rptUGLY
(
     @group AS varchar(1000)
     ,@startdate AS date
     ,@enddate AS date
)
AS
SET NOCOUNT ON;
DECLARE @uselessvar AS varchar(MAX);
SELECT
     p.BusinessEntityID
     ,p.FirstName
     ,p.LastName
     ,s.Name AS store
     ,st.[Group] AS TerritoryGroup
     ,cr.Name AS CountryRegion
     ,st.Name AS Territory
     ,cc.customercount
     ,s.BusinessEntityID
     ,s.Name
     ,s.SalesPersonID
     ,s.Demographics
     ,s.rowguid
     ,s.ModifiedDate
FROM
     AdventureWorks2008R2.Sales.SalesPerson AS sp
     JOIN AdventureWorks2008R2.Person.Person AS p ON sp.BusinessEntityID = p.BusinessEntityID
     JOIN AdventureWorks2008R2.Sales.SalesTerritory AS st ON sp.TerritoryID = st.TerritoryID
     JOIN AdventureWorks2008R2.Person.CountryRegion AS cr ON st.CountryRegionCode = cr.CountryRegionCode
     JOIN AdventureWorks2008R2.Sales.Store AS s ON s.SalesPersonID = sp.BusinessEntityID
     JOIN dbo.fn_Split(',', @group) AS grps ON st.[Group] = CONVERT(nvarchar(50), grps.SeparatedValue)
     OUTER APPLY (
          SELECT
               COUNT(*) AS customercount
          FROM
               AdventureWorks2008R2.Sales.Customer AS c
          WHERE
               s.BusinessEntityID = c.StoreID
     ) AS cc;

But Wait, There’s More

Formatting your code is only half of the features in SQL Prompt. There are three other features that every power user should use. (OK, truth be told, there are more than three, but my article shouldn’t be the length of a book.)

Feature 1:

SQL Prompt offers the ability to save snippets of code. This is really useful for repetitive tasks. Here is a list of articles that I’ve written talking about specific snippets that I find handy.

Placeholders are one of the most helpful features that SQL Prompt provides for you to work with snippets. The placeholders are reserved words that are variables for specific pieces of information or represent actions that can be applied to your snippet. They are always completely capitalized and are surrounded by dollar signs ($). You can find the current list for SQL Prompt 6 here.

One of the new placeholders is perfect for the power user. It allows you to insert a snippet and have part of it highlighted. Here is an example of how I like to use it. This particular snippet, which I named “st1”, will add SELECT TOP 100 * FROM in front of the table name that I’ve highlighted and copied to the copy buffer.

$SELECTIONSTART$SELECT TOP 100 *
FROM $PASTE$$SELECTIONEND$

The $PASTE$ placeholder will be replaced with whatever is in my copy buffer. The $SELECTIONSTART$ and $SELECTIONEND$ placeholders will cause the code within the placeholders to be highlighted.

To use it, I follow these steps.

1. I highlight a table I want to use and hit <ctrl> + c to get the table into my copy buffer.

SQLCoop5_image1

 

2. I type st1 and press <tab> to get the snippet. I now have a highlighted SELECT statement ready to execute, so I hit <ctrl> + e to execute it.

SQLCoop5_Image2

 

3. Finally, I type <ctrl> + z twice to undo the changes I’ve made.

SQLCoop5_image1

 

This allows me to quickly see some sample data from the table without ever leaving my current query window.

Feature 2:

This is a hidden gem. This feature finds all the invalid objects in your database. This is a great feature for those of you who need to perform code reviews and want to make sure that any DDL changes have not broken any existing SQL Objects.

In this example I have a table with a trigger that uses the field ModifiedDTS. I deleted the field and ran the command. It listed all the SQL Objects that are now broken due to the deleted field.

SQLCoop5_Image3

This feature does not have a Hot Key, but it is definitely worthy of a power user.

Feature 3:

This is a great “clean up” feature, especially when you are working with stored procedures with 200+ lines that you have inherited. You know the ones: they are filled with old code and have never been cleaned up. Well, this feature is perfect for them, because it finds all the unused variables and parameters. Simply use the hot key combination <ctrl> + b, <ctrl> + f. SQL Prompt will then underline in green all the unused variables and parameters for you to evaluate for usefulness.

SQLCoop5_Image4

 

I encourage you to spend 10 minutes going through the SQL Prompt menu to find out what other goodies exist for you to leverage. There are also new features coming soon in version 6.4. Here is a great reference to all the release notes for each version, plus the release notes for the current beta version.

Don’t Stop Yet

If you want to read more about how to work with SQL Prompt, then check out these blog posts:

To follow our quest for SQL knowledge through this collaborative project, follow the #SQLCoOp tag on Twitter.

See you next month!!

Original Post: On a SQL Collaboration Quest
