T-SQL Tuesday #72 Invitation – Data Modeling Gone Wrong

This month marks the 72nd T-SQL Tuesday. Adam Machanic (b|t) started the T-SQL Tuesday blog party in December of 2009. Each month an invitation is sent out on the first Tuesday of the month, inviting bloggers to participate in a common topic. On the second Tuesday of the month all the bloggers post their contributions to the event for everyone to read. The host sums up all the participants' entries at the end of the week. This month I’m the host and the topic is …

Data Modeling Gone Wrong

The purpose of SQL Server is to keep databases safe and running as optimally as possible. The problem is, if the data model is flawed or not maintained, then no matter how optimally SQL Server is configured, the database won’t be able to function efficiently.

I would like to invite you to share some data modeling practices that should be avoided, and how to fix them when they do occur.

Rules for T-SQL Tuesday Blog Party

Rule 1: Make sure that you include the T-SQL Tuesday image at the top of the post which will help identify your post as a T-SQL Tuesday blog post.  Please include a link back to this invitation too.

Rule 2: Publish your post sometime next Tuesday using GMT. Here’s a link to a GMT time converter.  For example, in California, that would cover 5 pm Monday to 5 pm (PDT) Tuesday.

Rule 3: Come back here and post a link in the comments so that I can find all the posts for the round up.

Rule 4: Don’t get yourself fired. Make sure that you either generalize your post or get permission to blog about anything from work.

Rule 5: If you roam the Twitterverse, then don’t forget to Tweet about your blog post with the hashtag #tsql2sday.

Rule 6: Go read someone else’s blog on the subject!

Final Rule: Have fun!


Top 10 Ways To Create Your Meet and Greet List For Summit

There was a time not too long ago when I didn’t know how to meet others in my profession. I didn’t know about SQL Saturdays or PASS Summit. The conferences that I was starting to attend were full of people who really didn’t want to network. But I’m persistent. I was determined to find a community of professionals who wanted to network. Then I found my first SQL Saturday, and I fell in love with the community that PASS helps create the world over. I now have friends on almost every major continent. (Are there any SQL professionals in Antarctica? If so, I want to meet you.)

Who you gonna meet?

As each PASS Summit approaches, I make a list. I check it twice. And I decide who I’m going to meet. My question to you is, who are you going to meet at PASS Summit (or at your next SQL Event)? How do you decide who you want to meet? If you are looking for ideas on how to make your list, and you should have a list, keep reading.

My Top Ten List of How I Pick People to Meet

 

1. Set up a Twitter account to get to know the #SQLFamily community

The first thing I did before my very first Summit was create a Twitter account. The SQL community has a huge presence there and it is a great place to get to know people from all over the world. You’ll not only connect with other individuals, but you’ll also see tweets of links to great articles that people share. You’ll also have a place to ask others how they solved the problems you are now facing through the hashtag #SQLHelp.

I had several people on my first “Meet and Greet” list whom I had met this way. One of them was Ed Watson, whom I’m still friends with.

We chatted on Twitter often. It was great meeting Anil in person.

Note: I recommend reading this before acquiring your first Twitter account. http://www.brentozar.com/twitter/book/

2. Consider the bloggers you follow

I checked out the list of blogs that I read and compared the authors to the list of attendees to see if any of the bloggers I knew were attending.

Ola Hallengren is known for his maintenance scripts.
We connected at a karaoke bar.

 

3. Ask the people in your local community if they are attending

They will be able to introduce you to other people during the event. I met several people at the SQL Saturday in San Diego who were also going to Summit. They were happy to introduce me to people at the various events we attended.

I know Phil from the local user groups.
He helped introduce me to other people.

4. Consider the speakers of the sessions you are attending

As you determine which of the sessions you want to attend, read up on the instructors. They all have small bios on the PASS Summit site. You can also check out their blogs. If they have something in common with you, or if they really helped shape your career, then add them to your list. Just don’t make your entire list out of the speakers. You need variety.


Jes Borland is an amazing speaker.
I’m so happy I’ve gotten to know her.

Note: I would recommend introducing yourself to speakers you want to meet at various after parties, during lunch, or as you see them in the halls. They are usually super busy right before their sessions setting up and right after their sessions answering questions.

But wait! There’s more!

Those are the easy ways to create a list before the event. But don’t stop building your list after the event starts. The list you bring with you is just the beginning. Keep reading to find out how to add to your list during the event.

5. Go to the networking parties in the evenings

At PASS Summit, there is a Networking Party put on by Andy Warren and Steve Jones. GO TO IT. Sit with people you don’t know. I know I will be. This event is not a sponsored event. In other words, you need to pay for your food and drink, BUT the networking is free and encouraged. Register for it here.

When I went to my first one, I met the lovely Viki Harp. She introduced me to Wendy Pastrick who whisked me away to meet Pam Shaw. It was actually amazing that I ate any food at all. It was so much fun meeting new people.

Jason and I are connecting at the
Summit Networking event.

6. Sit with new people during breakfast and lunch

At my first Summit, I only knew the people I had met at my first SQL Saturday, and I was very determined not to eat a single meal by myself. So I didn’t. Every morning, I got on Twitter and asked if I could join anyone for breakfast at the Daily Grill. I used the hashtag #Summit2012 (this year it will be #Summit2015, obviously). And guess what. I never ate alone. This wonderful woman, Monica Rathburn asked me to join her almost every morning.

This was my last breakfast at my first Summit.
We started with four people.

7. Consider people in your sessions

Introduce yourself to people sitting around you before the session starts. Or strike up a conversation about the session with someone after the session has ended.

Ritu and I connected when we realized
we kept attending the same sessions.

8. Hang out at the Community Zone at PASS Summit

This is a great place to meet people. Why? That is the purpose of the Community Zone. Usually there is a schedule for various groups of people to be in the Community Zone. So, if you really want to meet the Australians, then show up during the hour they are scheduled to be there. If you want to meet people from your own region, then come when they are scheduled to meet up in the Community Zone. (The best part is there are awesome bean bag chairs to sit in.)

Tjay and I ran into each other in
the Community Zone.

9. Attend the after parties

Attend as many after parties and other non-SQL events as you can. Yes, quite a few of them have drinking, and that might be an issue for you, but not all of them do. Here are some events with little to no drinking that are usually found at PASS Summit:

  • Running. That’s right, there is a large group of runners who get up when I’m still dreaming and go for a run. They usually have cool SQL shirts and Jes Borland is usually found leading the pack of SQL runners.
  • Board game night. Last year there were a couple of nights where people gathered around board games to talk and have fun.
  • PASS Prayers. This is a Christian group who meets for prayers and fellowship in one of the hotel lobbies in the morning, again when I’m still dreaming.
  • Photo walk: This is a great way to learn about Seattle, get a good walk outside, and get to know other SQL Photography lovers.

We were hanging out at an event in the evening.

Note: All the events that PASS knows about will be put on this page a few weeks before PASS Summit starts.

10. Attend other events that occur around PASS Summit

Last but not least, attend Redgate’s SQL in the City event on the Monday before PASS Summit. This is an amazing free event put on by Redgate. They have several speakers speaking on various topics. There’s also a free lunch and networking at the end of the event. When you are done, you can head over to the Networking dinner I mentioned in No. 5 above.

Sebastian Mein and I had a photo op with
the lovely Carly from Redgate visiting from the UK.

Hi. My name is…

One of the things you can do when you are talking to people is give them your card. Wait…You don’t have one? That is easily fixed. Vista Print is where I make mine and they always seem to have discounts. Since the card is about YOU and not your company, just put the contact info you are interested in sharing. I put my name, title, email, and a picture of myself.  My first year, I came home with 50 cards from other people. I wrote on the back where I met them. The following year I went through them and I was amazed at how many I still remembered and even interacted with through other SQL events and through social media (mostly Twitter).

Back to talking to people

So, you’ve got your list and you are standing in front of someone you wanted to meet. Now what?

If they’re not considered “famous”, then ask them if you could buy them a drink (coffee, soda, or bottled water works, too) or ask if they have time to meet in the Community Zone to chat. The Community Zone is usually full of awesome bean bag chairs to sit and talk in.  Tell them you wanted to meet them to talk about xyz, and xyz doesn’t have to be about SQL. Maybe you both enjoy art, or learning about Whiskey distilleries. (Oh wait, that’s me.)

Chris and I will be reconnecting over breakfast this year.

If you consider them “famous”, thank them for writing/speaking/inspiring. If they have time to talk, tell them about yourself and maybe ask them a question about SQL.

What if you are shy or introverted?

You can still make connections. You only really need to make one strong connection. It’s ok if it takes more than one Summit to develop. I have SQL Family friends that are shy/introverted. I make sure they go out to some of the events and are having a good time. I help with making introductions for them to make connections with people on their “Meet and Greet” list.

Anecdote: At my first Summit I met someone who was shy. We saw each other again at our second Summit, but it wasn’t until our third Summit that we developed a stronger connection. I know it was hard for them, but they called me and asked if they could go with me to an event. They weren’t comfortable going by themselves. I was happy to go to the event together. I was also happy to introduce them to other people in the community. I’m really looking forward to spending more time with them this Summit.

Follow the White Rabbit

In the end, it’s all about making connections. If Neo hadn’t followed the white rabbit, he wouldn’t have met Trinity who took him to Morpheus. These connections are not just for the yearly PASS Summit. They are there for as long as you nurture them. Some of the people I’ve met, I only see at Summit, some I see four or five times a year, and some I talk to every day through social media and Skype. These connections help remind me that I’m not crazy when the App Devs tell me that Foreign Keys are bad, and they help me when the poo hits the fan and I need to restore data in a way I’ve never done before. Finally, they are there for me because we are all part of this huge community, affectionately dubbed “SQL Family”.

Questions Answered From Presentation: SSRS 101 Creating Reports for Diagnostic Data

On Tuesday, July 7th, 2015, I had the privilege of speaking for the DBA Fundamentals Virtual Chapter. Thanks to the 375 who attended and to all who will be watching the recording. While I can’t read the comments during my presentation (way too distracting), I did enjoy reading the questions and comments that were sent to me. Below you will find answers to the questions I was sent.

Questions Answered

1. Are you talking about SSRS 2016 or 2014?

I was presenting using SSRS 2012, but the demos and discussions applied to versions 2008 through 2014.

2. When should you use SSRS RDL vs. RDLC (Report Definition Language Client-side)? Should you be able to invoke RDL from a web app (say web forms, ASP.NET) just as easily as RDLC?

The difference is that RDLC reports run on the client side; they don’t need to access the report server. RDL reports can be invoked from the client side, but they are rendered on the report server and then delivered to the client. RDLC reports are part of an application and will take up client resources to generate.

3a. What is the difference between SSRS and Reports Application in Visual Studio 2010 (full version)?

3b. Why use SSDT over Visual Studio?

I liked both of these questions, and they have the same answer, so I wanted to group them together.

SSRS (SQL Server Reporting Services) files are created in two applications: Report Builder and Visual Studio. Report Builder is launched from Report Manager on the report server. Visual Studio is the integrated development environment (IDE) that you write code in. Microsoft created (at least) two distributions of Visual Studio. The one that application developers use is known simply as Visual Studio, and the one that comes with SQL Server is called SQL Server Data Tools (SSDT). The IDE is the same for both of them. When they co-exist on the same computer, they will be integrated. If you only have the SSDT version, then you can only create SSRS, SSIS, and SSAS projects. You can also download other project types for Visual Studio, such as PowerShell projects.

Note: For SQL Server 2008 it was called Business Intelligence Development Studio (BIDS). It was renamed to SSDT in version 2012.

3c. Is it better to use SSDT or Report Builder? I have been using Report Builder for my reports so far…

The answer to this depends on your comfort level for creating reports. Report Builder is meant for the power users in the business. SSDT is meant for developers. I’ve personally never used report builder, but that’s because I’ve been writing reports since before Report Builder was introduced. If you continue to create reports in Report Builder, then take a look into generating Report Parts. Those were introduced in SQL 2012 and are supposed to be “building blocks” to help create more complicated reports within Report Builder.

4. If you don’t have the source of the report, can you get it back from the web?

I am happy to tell you, yes! (and there was much rejoicing!)

In the Report Manager, pull down the menu to the right of the report and select download. You can also go into the properties of the report and see the download option in the toolbar for that report.


5. How did you link the two datasets?

The two datasets weren’t linked like you would see with a JOIN statement. They were filtered using the same parameter. Datasets can’t be joined together within the report. There are a couple of functions, introduced in SQL 2008 R2, that allow you to reference a single value from another dataset, but that is as close as they have gotten.
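For reference, the function I’m thinking of is Lookup (SSRS 2008 R2 and later). A textbox expression like the one below pulls a single value from a second dataset; the field and dataset names here are made up for illustration:

=Lookup(Fields!CustomerID.Value, Fields!CustomerID.Value, Fields!SalesTotal.Value, "Dataset2")

The first two arguments are the key expressions evaluated in the current dataset and in the target dataset, the third is the value to return, and the last is the name of the target dataset.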

6. How is the security configured within the data sources? How can the double-hop authentication issue be avoided when accessing data from multiple servers?

You have 4 different security options that you can use. Books online can tell you the details on each of them.


I personally use a SQL Server login, used only for reports, to access the data. I then use Windows authentication to access the reports themselves. Each data source can connect to a different server and even use a different SQL Server account.

I’m not sure I’m answering your question adequately since it can be read a couple of different ways. If you are looking for a solution to a double-hop authentication problem, then take a look at this article. I’ve faced this once, but it was about 5 years ago. The Double-Hop Authentication Problem

Send me an email if you still have questions on this topic.

7. I saw that you saved the password for the data source (during development). How can we ensure that it is encrypted when deployed to the server?

I would have to do some research to see how the password is sent during the publishing event, but I do know it is stored encrypted.  In fact, the Connection String, UserName, and Password are all encrypted. Go into the ReportServer database and run this query: SELECT * FROM ReportServer.dbo.DataSource.

If you are still concerned about the Data Source during the publishing event, then you can create the data source manually in Report Manager. As long as the SSRS project has the property “OverwriteDataSources” set to false, the data source will remain the one you manually created.

8. What if you want to show multiple databases (in your report)?

You can create as many data sources as you need and they can each point to a different database on a different SQL Server.

If you are gathering diagnostic data, like we were doing in the demo, then you might want to consider a two step approach. The first step would be to have an SSIS package or a PowerShell script retrieve the diagnostic data from each server on a schedule and save the data to a central database. Then you can have the report pull the diagnostic data from that central database.

9. Can hidden parameters have values passed to them, for example: can a hidden parameter list containing state names have “CA” passed to it when the report is opened?

Yes. That would be done in the parameter properties on the report in the Report Manager, from code, or from another report. You can also create Linked Reports in the Report Manager and change the values of the hidden parameters.
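For example, if the report is opened through URL access, you can append the parameter value to the report URL; the server, folder, report, and parameter names below are hypothetical:

http://MyReportServer/ReportServer?/Sales/StateSalesReport&StateName=CA

The same idea works for hidden parameters, as long as the parameter is hidden rather than internal.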

10. Is there any difference between previewing the report and just running it by right-clicking on the report?

Not really.

11. Can you set the border style in your template so it’s not there at all?

Yes. Save a new tablix in the template with the borders set to None for each row, then copy that tablix each time you need one. Unfortunately, you can’t modify the actual tools in the toolbox.

12. Any strategies to fine-tune multiple drop-down options/filters in an SSRS report?

Yes.

1. Make sure that the Queries that are populating the drop down lists are FAST.

2. If the drop down lists are filtering each other, known as cascading parameters, then have them filter within the report, instead of making a round trip back to SQL Server for the filtering. This would be done by applying the filter to the 2nd parameter in the filter property instead of attaching it to the parameter property.

3. Make sure that the predicates of each of the queries and the final data set have proper indexes.

4. If the queries that the drop down lists are using are really slow due to too many JOINs, then consider using an SSIS package to create a static table that is updated every X minutes/hours with the latest data. Then use that table for the parameter lists.

13. This is not the question you are looking for. Move along. Move along.

14. How do you display a message in the report if the dataset doesn’t return any records?

This is an excellent question and the answer is not used enough.

Set the NoRowsMessage property on the tablix (Table/Matrix/List) control. You can get to it through the property window. The value of this property is displayed when there is no data to show. The value can even be an expression.
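As a quick illustration, the NoRowsMessage value can be a plain string or an expression such as the one below (the parameter name is hypothetical):

="No rows were returned for " & Parameters!StateName.Label & "."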

15. How to get a big report to limit to 4000 records but well distributed in terms of days in a month?

I would love to have more information on why you have this requirement; it is an interesting one. Here is what you can do. Note: it will slow the query down, so make sure you have good indexes on the predicates.

1. I have 1 million rows of random dates and numbers. This happens to be a very narrow table.


2. I used the ROW_NUMBER() function. It gives a sequential number to each row within a partition. I partitioned the data by year, month, and day. If your date field does not have a time component, then you can partition on the date field itself. Within each partition, the data is sorted by date and then TestDataID (the PK). This helps guarantee the same order each time.

3. I determined how many days were in the range of dates I’m selecting from, divided 4000 by that number, and selected only that many row-number values per date. This provides an even distribution across the days.

4. Since you can’t guarantee that the number of days will divide evenly into 4000, you need to accept either fewer than 4000 rows, or slightly more than 4000 by adding 1 to the number calculated in step 3.



--1 million rows
DECLARE @StartDate AS DATE = '1/1/1972';
DECLARE @EndDate AS DATE = '1/1/1973';

SELECT DATEDIFF(DAY, @StartDate, @EndDate) AS NumberOfDays;

WITH cte_ranking
AS
(
    SELECT TOP 100 PERCENT
        TestDataID
        ,SomeDate
        ,SomeTextNumber AS SomeNumber
        ,ROW_NUMBER() OVER (PARTITION BY dateyear, datemonth, dateday ORDER BY datevalue, TestDataID) AS RN
    FROM
        DemoProgramming.dbo.TestData AS td
        JOIN dbo.DimDate AS dd ON td.SomeDate = dd.DateValue
    WHERE
        td.SomeDate >= @StartDate
        AND td.SomeDate <= @EndDate
    ORDER BY datevalue
)
SELECT
    cte.TestDataID
    ,cte.SomeDate
    ,cte.SomeNumber
FROM
    cte_ranking AS cte
WHERE
    cte.RN <= (4000 / DATEDIFF(DAY, @StartDate, @EndDate)) + 1;

--Returned 4038 rows

16. With subscriptions is it possible to make one subscription to use the current fiscal month to generate report vs. creating 12 subscriptions for each fiscal month?

Unfortunately no. You can do it based on calendar months, but not based on fiscal months where the first day of the fiscal month may not be the 1st day of the calendar month. I would love to see this feature.  If this is a must have requirement, then email me and we can talk about some “creative” solutions.

17. What are the different security roles on report server side?

The SSRS development team was kind enough to add the definitions on the security screen.


 

18. Can you append the date/time to the file name that gets created with a subscription that saves to a fileshare?

Yes. Add the @ExecutionTime parameter to the filename.

19. How do you set-up the email option for the subscription?

First, your Report Manager needs to be set up to send email. I usually have my sysadmin help me with that.

After that, it’s a matter of populating fields (and there was much rejoicing.)

If you use the standard subscription, then the setup screen will look like figure A below. If you are creating a data driven subscription, then the setup screen will look like figure B below. Either way, you need to set all the properties. Then select a schedule for the emails to go out on.

I would highly suggest using Active Directory groups for the To list. This makes it easier to manage when people change jobs. You may also want the email to go to yourself for a few days or weeks to make sure it is going out as expected.

figure A: the standard subscription setup screen

figure B: the data-driven subscription setup screen

 

Thanks for all the fish

I wanted to give a shout out to Glenn Berry for letting me use his diagnostic queries for my demos. You can find the full diagnostic script, per SQL Version here on Glenn’s website.

The downloads for this presentation are available on the DBA Fundamentals Meeting Archives page and will be available on my website under Resources shortly.

T-SQL Tuesday #66: Monitor ALL Reports

Catherine Wilhelmsen (b|t) is hosting this month’s T-SQL Tuesday blog party. The party was started by Adam Machanic (b|t) in December of 2009.

This month’s invitation is about monitoring.

This topic is a tough one. I didn’t want to pick just one topic, especially since I just installed SQL Sentry’s Performance Advisor in production and Redgate’s DLM Dashboard in Dev. Both products are helping me see my environments more clearly.

But what about the need to monitor other items that don’t have a cool tool you can buy off the shelf? Or products like SSRS that have built in metrics collection?

That is when you need to roll your own code to capture metrics to help monitor your own, unique environment.  Take Excel reports as an example, or really any application that is used for reporting. It would be wonderful if you could capture metrics and monitor usage of ALL reports across your environment in a consistent manner. Let’s take a look at a way that can be accomplished.

In the beginning

The first thing you need to do is some planning. You need to know what kind of data you want to capture, and how you are going to update the existing reports to capture that data.

I happen to really like the way that SSRS collects usage metrics. So, for my Excel reports, I wanted to collect similar data. This will allow me to have one report to show how all my reports are performing. Maybe my top 10 slowest reports are from 3 different reporting platforms.

I created a couple of tables to hold the non-SSRS report metrics and metadata. I then created a stored procedure to insert the data into the tables. Finally, I added a call to that stored procedure at the bottom of the stored procedures used by the Excel reports I wanted to keep track of.

Step 1

I created tables to hold the information I wanted to collect.


CREATE TABLE dbo.ReportType
(
    ReportTypeID INT IDENTITY PRIMARY KEY
    ,ReportTypeName VARCHAR(20) NOT NULL
);

INSERT INTO dbo.ReportType (ReportTypeName) VALUES ('Excel');
INSERT INTO dbo.ReportType (ReportTypeName) VALUES ('PowerPoint');

CREATE TABLE dbo.ReportLog
(
    ReportLogID INT IDENTITY PRIMARY KEY
    ,ReportTypeID INT NOT NULL
    ,ReportPath VARCHAR(255) NOT NULL
    ,ReportName VARCHAR(255) NOT NULL
    ,UserName NVARCHAR(255) NOT NULL
    ,SprocName VARCHAR(255) NOT NULL
    ,ReportParameters VARCHAR(500) NULL
    ,TimeStart DATETIME NOT NULL
    ,TimeEnd DATETIME NOT NULL
    ,Row_Count INT NOT NULL
);

Step 2

I created a stored procedure that I could call at the bottom of the stored procedures the reports are using.


CREATE PROCEDURE dbo.InsertReportUsage
(
    @ReportTypeID INT
    ,@ReportPath VARCHAR(255)
    ,@ReportName VARCHAR(255)
    ,@UserName NVARCHAR(255)
    ,@SprocName VARCHAR(255)
    ,@ReportParameters VARCHAR(500)
    ,@TimeStart DATETIME
    ,@TimeEnd DATETIME
    ,@Row_Count INT
)
AS
BEGIN TRY
    SET NOCOUNT ON;

    INSERT INTO dbo.ReportLog
    (
        ReportTypeID
        ,ReportPath
        ,ReportName
        ,UserName
        ,SprocName
        ,ReportParameters
        ,TimeStart
        ,TimeEnd
        ,Row_Count
    )
    VALUES
    (
        @ReportTypeID
        ,@ReportPath
        ,@ReportName
        ,@UserName
        ,@SprocName
        ,@ReportParameters
        ,@TimeStart
        ,@TimeEnd
        ,@Row_Count
    );
END TRY
BEGIN CATCH
    DECLARE
        @ErrorMessage AS NVARCHAR(3000)
        ,@ErrorSeverity AS INT;

    SET @ErrorMessage = ISNULL(DB_NAME(DB_ID()) + N'.' + SCHEMA_NAME(SCHEMA_ID()) + N'.' + OBJECT_NAME(@@PROCID, DB_ID()),
        N'SQL Object Name Not Available') + N': Error: ' + CONVERT(NVARCHAR(10), ERROR_NUMBER()) + N' Line: '
        + CONVERT(NVARCHAR(5), ERROR_LINE()) + N' - ' + ERROR_MESSAGE();

    SET @ErrorSeverity = ERROR_SEVERITY();
    RAISERROR(@ErrorMessage, @ErrorSeverity, 1);
END CATCH

SET NOCOUNT OFF;
GO

Step 3

I added some common parameters at the top of the stored procedure for my report. Then, at the bottom, I called the stored procedure that records the report usage metrics.

Since I didn’t want to hard code the report path or the report name in the stored procedure, I passed those values from Excel.

I was also able to dynamically capture the stored procedure name, and the system user’s credentials.

Here is an example of a report that lists the names of employees.


CREATE PROCEDURE dbo.PersonReport
(
    @LastName VARCHAR(100)
    ,@ReportPath VARCHAR(255)
    ,@ReportName VARCHAR(255)
)
AS
BEGIN TRY
    --SET NOCOUNT ON  (Don't set)

    --------------Excel Log Block--------------
    DECLARE @SprocName VARCHAR(255) = DB_NAME(DB_ID()) + '.' + SCHEMA_NAME(SCHEMA_ID()) + '.' + OBJECT_NAME(@@PROCID, DB_ID());
    DECLARE @ReportParameters VARCHAR(8000) = 'LastName=' + @LastName;
    DECLARE @TimeStart DATETIME = GETDATE();
    --------------Excel Log Block--------------

    SELECT
        p.BusinessEntityID
        ,p.Title
        ,p.FirstName
        ,p.MiddleName
        ,p.LastName
    FROM
        Person.Person AS p
    WHERE
        p.LastName LIKE @LastName + '%';

    --------------Excel Log Block--------------
    --Capture the row count and end time immediately after the SELECT.
    --GETDATE() and SUSER_NAME() can't be passed directly as EXEC arguments,
    --and capturing @@ROWCOUNT in a variable keeps it from being overwritten.
    DECLARE @RowCount INT = @@ROWCOUNT;
    DECLARE @TimeEnd DATETIME = GETDATE();
    DECLARE @UserName NVARCHAR(255) = SUSER_NAME();

    EXEC ReportManagement.dbo.InsertReportUsage 1, @ReportPath, @ReportName, @UserName, @SprocName
        ,@ReportParameters, @TimeStart, @TimeEnd, @RowCount;
    --------------Excel Log Block--------------
END TRY
BEGIN CATCH
    DECLARE
        @ErrorMessage AS NVARCHAR(3000)
        ,@ErrorSeverity AS INT;

    SET @ErrorMessage = ISNULL(DB_NAME(DB_ID()) + N'.' + SCHEMA_NAME(SCHEMA_ID()) + N'.' + OBJECT_NAME(@@PROCID, DB_ID()),
        N'SQL Object Name Not Available') + N': Error: ' + CONVERT(NVARCHAR(10), ERROR_NUMBER()) + N' Line: '
        + CONVERT(NVARCHAR(5), ERROR_LINE()) + N' - ' + ERROR_MESSAGE();

    SET @ErrorSeverity = ERROR_SEVERITY();
    RAISERROR(@ErrorMessage, @ErrorSeverity, 1);
END CATCH

SET NOCOUNT OFF;
GO

Last Step

Finally, I created a stored procedure that joins the metrics from the ExecutionLog3 view (provided in the ReportServer database for SSRS report usage) with the metrics I captured in the ReportLog table.
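The stored procedure itself isn’t shown here, but this minimal sketch shows how the two sources can be combined; the column mapping is my own assumption based on the tables above, so adjust it to the columns you actually track:

SELECT
    'SSRS' AS ReportTypeName
    ,e.ItemPath AS ReportPath
    ,e.UserName
    ,e.TimeStart
    ,e.TimeEnd
    ,e.[RowCount] AS Row_Count
FROM
    ReportServer.dbo.ExecutionLog3 AS e
UNION ALL
SELECT
    rt.ReportTypeName
    ,rl.ReportPath + '/' + rl.ReportName AS ReportPath
    ,rl.UserName
    ,rl.TimeStart
    ,rl.TimeEnd
    ,rl.Row_Count
FROM
    dbo.ReportLog AS rl
    JOIN dbo.ReportType AS rt ON rt.ReportTypeID = rl.ReportTypeID;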

Now I can generate an overarching report for all reports in my environment.

Thanks for all the fish

Thanks go out to Catherine Wilhelmsen for hosting this month’s T-SQL Tuesday blog party. She is one of my favorite people to catch up with during PASS Summit. I hope you spend some time catching up with her on her blog.

Exploring SQL Prompt 6.5

The SQL Prompt team has been working hard at the Redgate campus in England, and they have outdone themselves yet again with their latest release, SQL Prompt 6.5. I’m extra excited about this release because several of the features I’ve been wanting for a while are available this time around. In this post, I’m going to go over my top three favorite new features.

Tab Coloring

I was so happy to see this feature move from the Experimental Lab List to the Feature List. It was like watching Pinocchio become a real boy. Not only does this feature behave the way it did when it was on the Experimental Lab List, but they added some additional cool features to it.

So what does it do for you? It colors the tabs of each query window based on environment settings that you set up. You’re probably thinking, “Yeah, Mickey, but SSMS already does that in the status bar for me.” Well, the Tab Coloring in SQL Prompt is way better. Here is why.

I rarely look at the status bar, but I’m always looking at the top 20% of the screen, so for me, a very visual person, having the color in that top 20% of the screen is much more effective. SQL Prompt provides that for me, by coloring the tab of the query window and adding a line of the same color below the status bar. If you undock the query window, then the whole window is outlined in the tab color.


 

You can create as many environments as you want and you can pick any color…or shade of color that you want. This was one of the improvements to Tab Coloring from when it existed in the Experimental Lab list.

To modify the environments, follow these steps:

  1. Go to the SQL Prompt menu and select Options.
  2. Select Colors from the Tabs list on the left hand side.
  3. Click on the Edit Environments button in the bottom right hand corner, under the table.
  4. Add or modify an environment.
  5. Click on a color to modify the color to meet your needs.
  6. Apply these environments to the servers they represent on the previous window.


 

I’m notorious for saving stored procedures in the Master database in development. To help prevent this, I can set up a Tab Color for a specific database on a specific server. For me, that would be the Master database in dev. I love this!

To do this, follow these steps.

  1. Go to the SQL Prompt menu and select Options.
  2. Select Colors from the Tabs list on the left hand side.
  3. Click on the Add Server/Database link in the table.
  4. Add a Server Name in the first column.
  5. Add a Database Name in the second column.
  6. Select an Environment for the color in the third column.

(Screenshot: server and database entries mapped to environment colors)

Note: In the above example, all three colors are for the same server. This means that when the database is not Utility or Master, the tab coloring will be green…even for new databases on that server.

In Tab History, you can see which environment each query was last run against. In the example below, I can see that I ran SQLQuery6 in the Master database on SQLDemoMonkey. I know this because I set up a database-specific Tab Color for the Master database on SQLDemoMonkey. (I guess I should go make sure I didn’t create a stored procedure in that database.)

(Screenshot: Tab History showing the environment color of each query)

Intelligent Intellisense for GROUP BY

One of the boring tasks of writing a GROUP BY clause is re-writing all the columns that you already wrote in the SELECT section of your query. Many times, I’ll copy what I wrote in the SELECT section and paste it under GROUP BY…but then the list still requires editing. Not any more. SQL Prompt now provides intelligent IntelliSense for GROUP BY. Take a look:

 

(Screenshot: the GROUP BY IntelliSense suggestion list)

 

In this query I used Ctrl + to get the IntelliSense to pop up for my GROUP BY. At the top of the list, there is a shortcut for having all the non-aggregated columns inserted for you. Also notice that RequestType from line 42 is not listed at the top of the IntelliSense box, because it is an aggregated field.

Format Actions and <ctrl> + k, <ctrl> + y

I wrote a post, back in July of 2014, about SQL Prompt formatting shortcuts.  I wrote:

As a power user, my key strokes will look like this when I use them all at once: <Ctrl> + bw, bq, bc, ky and finally <Ctrl> + e to execute my code. Here is an example of how an ugly duckling turns into a beautiful swan by applying all of these hot keys at one time.

That long list of key strokes is now obsolete. (And there was much rejoicing). I now can select from a list of formatting options that will be applied with only the Ctrl + K, Ctrl + Y shortcut. This is so much easier!

To modify the list of formatting options, follow these steps.

  1. Go to the SQL Prompt menu and select Options.
  2. Select Actions under the Format list on the left.

 


 

It’s A Wrap

I only listed three of the new features that SQL Prompt 6.5 has. I encourage you to take a look at the Release Notes for the full list. If you don’t have SQL Prompt, then download a free 14 day trial here. If you already have SQL Prompt, then you can go to the SQL Prompt/Help/Check For Updates menu item to get the latest version.

Transaction Log Backups for the Accidental DBA

The Next Quest

In this installment of SQLCoOp, we are writing about backups. I have chosen to focus on Transaction Log Backups. Why? Well, at one of my former jobs, I found myself in the role of Accidental DBA. The company did not have automated backups. They were manually and infrequently performed. I knew this needed to be changed. And this is where my post begins.


Manual and Frequent Backups

Since I wasn’t a DBA, I started small. I performed manual full backups of our databases often.  I made sure that our backups were stored on a different LUN than our data files. It was working great…until it wasn’t. One day everything stopped.

Lessons Learned

I learned several things that day.

1. The difference between the Full Recovery Model and the Simple Recovery Model is what happens to the transaction log: under Simple, the log is truncated automatically, while under Full it keeps growing until you back it up. We were using the Full Recovery Model because I wanted transactions to be logged.

2. Creating a full backup does not affect the transaction log at all. You also have to back up the transaction log itself. If you don’t, it will continue to grow until it fills the entire drive. And then your database will stop.
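For reference, a log backup is a single statement; the database name and path below are hypothetical:

BACKUP LOG YourDatabase
TO DISK = N'E:\SQLBackups\YourDatabase_log.trn';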

3. If you’ve created a dedicated drive for your log files and they use all the space, then your database will stop. You can’t grow the drive without emptying the drive first.

4. After (finally) getting a backup of the transaction log, the Log file will not change in size, even though it is “empty”. You will need to shrink it.

You see, log files can grow as more space is needed. When a backup of the log file occurs, the log file will stay the same size, but the data inside will be truncated. Once the log file uses all the space again, it will need to grow again. This is normal. The log file only needs to be shrunk when it is taking up the entire drive. Read this article for the technical explanation of how log files work.

5. If you get stuck and no one in the office can help, then go to twitter to ask for help. Use the hashtag #SQLHelp with a short synopsis of why you need help. I spent three hours fixing the situation with the aid of a SQL Community member. Why three hours? Well the drive was full and there was no way to get more space.

Tools For Next Time

Now I am armed with some tools that I can use to help prevent finding my log file drive full and tools to fix the problem when it does happen again.

1. The drive location of the log files needs to be monitored, and an alert needs to be set up to email/text you when the drive is almost full. Make sure to test the alert. It won’t do you any good if it can’t reach you.
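If you want a starting point, this rough sketch (assuming SQL Server 2008 R2 SP1 or later for sys.dm_os_volume_stats) shows the free space on every drive that holds a log file; the alerting itself still needs to be wired up separately:

SELECT DISTINCT
    vs.volume_mount_point
    ,vs.available_bytes / 1048576 AS free_mb
FROM
    sys.master_files AS mf
    CROSS APPLY sys.dm_os_volume_stats(mf.database_id, mf.file_id) AS vs
WHERE
    mf.type_desc = 'LOG';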

2. Set the maximum log file size to a specific value, and make that value smaller than the drive the log file resides on. Paul Randal (b|t) has a great article called Choosing Default Sizes for Your Data and Log Files that talks about how to choose log file sizes.

The database will still stop if that limit is reached, but provided the drive isn’t full, you’ll be able to increase the size of the log file so that the database can function again. Then you can fix the problem that caused the log file to fill up.
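Setting the cap is one ALTER DATABASE statement; the database name, logical file name, and size below are hypothetical:

ALTER DATABASE YourDatabase
MODIFY FILE (NAME = YourDatabase_log, MAXSIZE = 40GB);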

3. Create a large unimportant file. Name it something like ‘DeleteMeInAnEmergency.txt’ and put it on the drive where your log files are stored. This will guarantee that you will have extra room on your drive when a log file fills the drive. You can then move this unimportant file to another drive which will free up space instantly. You can get your database running again and then fix the problem that caused the log file to fill up.
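One quick way to create such a placeholder file is fsutil from an administrative command prompt; the path and size (10 GB here) are just examples:

fsutil file createnew E:\SQLLogs\DeleteMeInAnEmergency.txt 10737418240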

4. Bookmark, memorize, or get a tattoo of this link from Glenn Berry (b|t) called How to Diagnose and Correct a “Runaway” Transaction Log. It has the steps to fix the log file and make it happy again. While you are there, check out his diagnostic queries to learn more about your database servers.

5.  Setup an automated backup plan. This can easily be done with Ola Hallengren’s (b) Backup Maintenance Solution (and it’s free). It is really easy to setup and configure to meet your needs. While you are there, you can also check out his Index Maintenance Scripts. They are wonderful.

NOTE: Always test code from the internet in development and QA before deploying to production.

Don’t Stop Yet

Don’t forget to check out these blog posts by the rest of the SQL CoOp team on the subject of Backups:

To follow our quest for SQL knowledge through this collaborative project, follow the #SQLCoOp tag on Twitter.

See you next time!!

On a SQL Collaboration Quest

Four SQL professionals gathered from the four corners of the world to share their SQL knowledge with each other and with their readers: Mickey Stuewe from California, USA, Chris Yates from Kentucky, USA, Julie Koesmarno from Canberra, Australia, and Jeffrey Verheul from Rotterdam, The Netherlands. They invite you to join them on their quest as they ask each other questions and seek out the answers in this collaborative blog series. Along the way, they will also include other SQL professionals to join in the collaboration.

Original Post: On a SQL Collaboration Quest

Questions Answered From Presentation (Part 2): Creating SSRS Reports Efficiently Through Best Practices

On Tuesday, March 3rd, 2015, I had the privilege of speaking for Pragmatic Works. For the month of March, they are highlighting Women in Tech and have the whole month lined up with female speakers on each Tuesday and Thursday. I was the first in the lineup for the month, and it ended up being my largest audience to date. There were 419 people listening in. I even saw some comments that there were others who couldn’t make it. Wow!

Thanks to everyone who came to listen to my session. While I couldn’t hear anyone, I enjoyed reading the questions and comments that were sent to me afterwards, and let me tell you, there were quite a few. Below you can find the answers to the SECOND HALF of the questions that were asked. The first half of the questions can be found here.

Questions Answered

1. How do you make the report see the test database vs. the development or production database?

This is done by changing the data source properties. Say I have a database server called Monkey. I would create a shared data source called Monkey that each of the reports would share. On the Development SSRS Server, I would have the Monkey data source point to the development database. On the QA server, I would have the Monkey data source point to the QA database. And on the Production server, I would have the Monkey data source point to the production database. Each time the report is loaded into the new environment, it would automatically point to the correct database server.

If you are in a situation where environments have to share an SSRS Report Manager (and I’m sorry if you are in that situation), then I would create a separate parent folder for each environment, for instance a parent folder called Development and a parent folder called QA. In each of these parent folders, I would create a folder for all your shared data sources for that environment. Each parent folder would have its own copy of the report and its own shared data source called Monkey pointing to the correct database.

2. When copyrighting, does your company need to be registered with the copyright office?

The answer to this question should come from your legal department. In my experience, the answer is no. Copyright laws are different from patent laws. When I was an artist, the fact that I made the artwork first meant I owned the copyright, even in another country.

3. Is there a query for getting reports that have not been run?


SELECT
    c.ItemID
    ,c.Path
    ,c.Name
FROM
    ReportServer.dbo.Catalog AS c
    LEFT JOIN ReportServer.dbo.ExecutionLog3 AS e ON c.[Path] = e.ItemPath
WHERE
    e.ItemPath IS NULL
    AND c.[Path] NOT LIKE '/Decommisioned Reports%' --Ignore folders I don't care about.
    AND c.Type NOT IN (1, 5, 8) -- Ignore folders, data sources, and data sets
ORDER BY
    c.Path
    ,c.Name;

4. How did you add the template file in Visual studio?

For SSRS 2012, I put the RDL, RSD, and RDS files in the following directory: C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies\ProjectItems\ReportProject. The directory will be different for 2008 and 2014.

5. What if a company doesn’t have any procedure for gathering requirements and creating/keeping documentation?

This puts you in the driver’s seat. I would suggest meeting with your manager to discuss the benefits of having requirements documentation and a standard way of maintaining it. I would then outline a plan for starting on this new path. I would start with new reports and slowly add in existing reports as they get modified.

Awesome quote from one of the attendees:
I ask, “What decision is going to be taken with the data from the report?”

6. How do you know the width to fit on landscape and portrait nicely?

I like half inch margins. So I subtract the two half inch margins from the width of the landscape or portrait paper. That becomes the width of the base report. I also set the half inch margins in the Margin property of the Report.

7. Could you show an example of a Tracking Number?

Sure! If I were using an Excel spreadsheet to keep track of my reports, I would number all my reports starting with 1000. (Why 1000? I like starting with 4 digits. No other reason.) If I were using a product like Team Foundation Server (TFS), then I would have each report represented by a work item. In either case, say the report named National Sales Revenue was represented by the number 1500. 1500 would always represent the report National Sales Revenue.

The first time this report is successfully published to production, it would be given a “release number” of 00. So the tracking number would look like 01500.00. After modifications were made to the report and republished to production, it would get a new release number of 01. The new tracking number would look like 01500.01.

In the location where I store all my documents for my reports, the report National Sales Revenue, will be stored under the number 01500. I usually add the name of the report to the folder name so that it looks something like 01500 – National Sales Revenue. This allows for two things. One, you can create the folder before you have a permanent title. (The temporary folder name would just be 01500.) Two, if the title changes later for some reason, you don’t lose the documentation location. It’s still under 01500.

8. What methodology do you use to create the Tracking Number or is it automatically generated from TFS?

When I’m using TFS, it is automatically generated for me. TFS uses a single sequential series of numbers across all the work item types (task, bug, custom work item, etc.), so you may have a report with number 1500, and then the next report will be number 1583. I do put a leading zero or two in front of the number so that when it goes to 5 digits, my folders are still in the correct order.

9. How do you put in the watermark?

a. Click on the base report.

b. Go to the properties window.

c. Click and open the BackgroundImage property.

d. Set the BackgroundRepeat property to repeat.

e. Set the MIMEType property to the correct format.

f. If you are using an Embedded image, set the following properties.

i. Source = Embedded

ii. Value = iif(First(Fields!EnvironmentID.Value, "GetReportStandards") < 3, "DRAFT", "")

g. If you are using an image from the database, set the following properties.

i. Source = Database

ii. Value = iif(First(Fields!EnvironmentID.Value, "GetReportStandards") < 3, First(Fields!Draft.Value, "GetReportStandards"), "")

h. If you are using an image from a server location, set the following properties.

i. Source = External

ii. Value = iif(First(Fields!EnvironmentID.Value, "GetReportStandards") < 3, "location of your image", "")

10. If a report is setup as a subscription, how can I tell if the subscribers are actually using/accessing the report?

This is an excellent question. The ReportServer Database will tell us which reports were executed through a subscription, but it cannot tell us if they were used. I have some ideas on ways to still capture the information, but they would require research and extra layers of complexity. If you want to hear about these ideas, send me an email.

Awesome idea from one of the attendees:
Maybe we should add a “pixel tag” to the report so that it hits a web server, which can track access.

11. This is not the question you are looking for. Move along. Move along.

12. Is there any way to handle the row height dynamically in Excel output when it includes merged cells? (CanGrow does work with merged cells.)

I don’t know a definitive answer to this question, but my inclination is to say no. Dynamically changing row height/width outside of the auto grow properties, is usually not possible. I have found ways around this by providing multiple layouts of columns/rows and hiding the ones I don’t want.

13. How can you use a business glossary to develop your label and be consistent across reports?

This question is a big one. The simple answer is that you need buy-in from the business to make it happen and discipline from the entire team to make it successful. I would suggest you look into the topics of Data Governance, Business Semantics Glossary, and Conceptual Data Modeling. These areas are what would help create and drive the business glossary from the very beginning, so that by the time you are ready to create a report, the terms already exist.

If you can’t get buy in yet, or ever, then I would suggest creating your own business glossary. Excel would work and is easily searchable. Make sure it is stored where your whole team can update it and use it.

14. Any major changes to SSRS 2014?

No. You can see the changes from version to version here. Just select the version you want to look at from the “other versions” drop down at the top of the article.

15. Can you tell what format they rendered it in (PDF, Excel, etc.)?

The Format Field in the ExecutionLog3 view will give you that information. Here is an example looking at the rendered format of a subscription.


SELECT
    ItemPath
    ,UserName
    ,RequestType
    ,[Format]
    ,ItemAction
    ,TimeStart
    ,TimeEnd
FROM
    ReportServer.dbo.ExecutionLog3
WHERE
    RequestType = 'subscription'
    AND ItemAction = 'Render';
16. How long did it take you to build this framework?

It took me a couple of years. I started by tracking down all the reports and entering them in an Excel spreadsheet. My manager wanted a more permanent solution that could be shared with others, so I was given time to create a custom Report Work Item in TFS. I then moved the whole catalog there and decommissioned the Excel based catalog.

For the requirements and design document, I slowly saw a pattern of questions that I kept asking, so I designed a requirements document and tweaked it over time.

For my report templates, I looked at existing reports. I took the consistent layouts I found, filled in some things I felt were missing like showing the parameters, and created the templates. They stayed pretty consistent over the years.

17. Is there a way to find out the limit for exporting data to Excel? We have reports that we can view, but that time out when exporting to Excel. Is the limit based on the size of the Excel file or the number of rows?

The limit will be how long it takes to do the export. There is a time limit, and if the export is not completed in that time frame, it will time out. This can be changed with settings in Report Manager. As far as the amount of data Excel itself can take, I would Google that, since different versions of Excel have different limits.

18. What’s better to use stored procedure or embedded SQL query and how to determine that?

This is an excellent question. This is also a highly debatable discussion, so this is my stance on the topic.

I prefer stored procedures for the following reasons:

  • Stored procedures are easier to modify when they need to be optimized. It is easier to track down a stored procedure in the plan cache than an inline SQL statement in a report. You also only need to deploy the updated stored procedure back to production, not the report.
  • Dynamic queries are more likely to be classified as an “ad hoc query” by SQL Server. Depending on how your SQL Server properties are set, the execution plan may never be kept in the cache, which means you will always pay the overhead of creating a new execution plan.
  • On the reverse, if your SQL Server properties are set to cache ad hoc queries, then you can cause unnecessary bloating in your plan cache, since every embedded query with different parameter values is seen as a different query. That is not true of stored procedures…unless they have dynamic SQL inside them. (The query after this list shows one way to check for that bloat.)
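If you want to see how much of this is going on in your own plan cache, a quick look at sys.dm_exec_cached_plans breaks the cache down by object type:

SELECT
    cp.objtype
    ,COUNT(*) AS plan_count
    ,SUM(CAST(cp.size_in_bytes AS BIGINT)) / 1048576 AS size_mb
FROM
    sys.dm_exec_cached_plans AS cp
GROUP BY
    cp.objtype
ORDER BY
    size_mb DESC;

A large amount of space under the Adhoc object type is a hint that embedded, non-parameterized queries are bloating the cache.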

19. Would you recommend some sort of standards for Data Source Names?

I’m in favor of standards for everything. It satisfies my OCD needs. I keep all my data sources in one folder called either “Data Sources” or “DataSources”, depending if you like spaces in names. For the names, my data source names are the same name as the database they represent. If you have multiple databases with the same name in your production environment, then I would suggest adding the server name to the data source name. You want the Data Source names to represent what they are pointing to.

Name data sets the same way. I use the name of the stored procedure as the name of the data set. If you have the same stored procedure name in different Schemas, then I would include the Schema name in the data set name.

20. How to deploy the SSRS reports using TFS auto build?

I haven’t had the pleasure of using this feature. I unfortunately had to write a deployment document so that the report(s) could be deployed manually.

21. Can you mention the bug again with 2008 R2 with concatenated fields in excel prints?

Sure. In the footer of the report, I had a single textbox that had multiple fields in it. When the report was exported as an Excel document, the textbox was created in the footer of the Excel document, but the font size had been increased to 72 PT font. This bug did not occur on all of our servers.

22. All these best practices are good. I would like to see some more of practical on how to create hierarchies and links and other stuff, maybe in next webinar.

Please, please, please email me what you would like to see in sessions. I am always looking for ideas.

I have an example of a way to handle a hierarchy in my other Pragmatic Works session called: Scalable SSRS Reports Achieved Through the Powerful Tablix.

You can look at my recorded presentations page to see a list of presentations that have been recorded.

You can also look on my 2015 Speaking Engagements to see where I will be speaking or hope to be speaking this year.

23. No Russian Thank you? 🙁

Спасибо (Thank you!)

Thanks for all the fish

Thanks go out to everyone who attended the presentation and for the patience of those who waited two weeks for me to answer the second half of the questions.

This is me sitting at a local café with my oldest daughter while I write, she studies, and we both drink flat white coffee. (This week she was working on an essay about Frankenstein and an exegetical essay for another class.  I’ll stick to SQL and SSRS.)

Questions Answered From Presentation (Part 1): Creating SSRS Reports Efficiently Through Best Practices

This past Tuesday, March 3rd, 2015, I had the privilege of speaking for Pragmatic Works. For the month of March, they are highlighting Women in Tech and have the whole month lined up with female speakers on each Tuesday and Thursday. I was the first in the lineup for the month, and it ended up being my largest audience to date. There were 419 people listening in. I even saw some comments that there were others who couldn’t make it. Wow!

Thanks to everyone who came to listen to my session. While I couldn’t hear anyone, I enjoyed reading the questions and comments that were sent to me afterwards, and let me tell you, there were quite a few. Below you can find the answers to HALF of the questions that were asked. The other half of the questions will be answered next weekend.

Questions Answered

1. The number one question that was asked was: Will your documents, like your requirements document, be available?

Yes! You can find all the assets for the presentation under the Resources\Presentation menu on my website. I list all my presentations on this page, so just scroll to the correct presentation title.

I did get some comments about my SSRS templates missing. That has been fixed. Here are the three separate downloads:

If you missed the presentation, don’t worry. Pragmatic Works will be adding it to their catalog of past presentations here.

2. Hey, can you please share us that sample database?

I’m not sure which sample database this was pertaining to. Please email me.

3.  Would you give that mock up tool again?

It’s called Balsamiq. They are a wonderful company with a great user support forum. You can find tons of templates on their website and get your feet wet with a trial version.

4. Do you have any scripting that can be used to see what subscriptions are set up using all the various parameters?

I broke this question up into two questions because I wasn’t clear if you wanted existing subscriptions or executed subscriptions.

Executed

The ExecutionLog3 view in the report server database will have this information for you. You can filter the data on the RequestType field to show only subscriptions. The Format field will tell you how it was rendered, and the Parameters field will tell you what all the parameters were set to. The downside is that they use special escape characters, so you’ll have to decipher them.


-- Subscription executions between two dates, pulled from the SSRS execution log.
-- Unambiguous (YYYYMMDD) date literals avoid regional date-format surprises.
DECLARE
@StartDate DATETIME = '20150301'
,@EndDate DATETIME = '20150405';

SELECT
InstanceName
,ItemPath
,UserName
,RequestType   -- 'Subscription' or 'Interactive'
,ItemAction
,TimeStart
,Status
,Format        -- render format used (PDF, EXCEL, CSV, etc.)
,Parameters    -- parameter values, stored with escape characters
FROM
ReportServer.dbo.ExecutionLog3
WHERE
ItemPath NOT LIKE '/DataSets/%'
AND TimeStart >= @StartDate
AND TimeStart <= @EndDate
AND RequestType = 'Subscription'
AND ItemPath <> 'Unknown';

Existing  

For existing subscriptions, you can look in the Subscriptions table in the ReportServer database. This lists the subscriptions that are set up, how they will be rendered, and the parameter values that will be used as well as other information.
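If it helps to have a starting point, here is a minimal sketch of that kind of query. It assumes the default ReportServer database name and the standard Subscriptions, Catalog, and Users tables; since Microsoft doesn’t document or guarantee this schema, verify it against your own instance before relying on it.

-- Sketch: list existing subscriptions with their report, owner, and delivery details.
-- Assumes the default ReportServer database name; adjust for a named instance.
SELECT
c.Path               AS ReportPath
,u.UserName          AS SubscriptionOwner
,s.Description
,s.DeliveryExtension -- e.g. email or file share delivery
,s.EventType
,s.LastStatus
,s.LastRunTime
,s.Parameters        -- parameter values stored as XML
FROM
ReportServer.dbo.Subscriptions AS s
JOIN ReportServer.dbo.Catalog AS c ON c.ItemID = s.Report_OID
JOIN ReportServer.dbo.Users AS u ON u.UserID = s.OwnerID
ORDER BY
c.Path;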

5. Is there a way to bulk change the owner of a subscription? We have people leave, and when their accounts are deleted, their subscriptions stop working.

I’ve been asked this before. The information is stored in the ReportServer database, but I am always cautious about directly modifying this database. So, if you change it, test it heavily in development before applying the changes to production.
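For those who do decide to go that route, a bulk owner change usually boils down to something like the sketch below. The account names are placeholders, it assumes the default ReportServer database and its standard Users and Subscriptions tables, and the new owner must already exist in the Users table (they need to have accessed the report server at least once). Again: development first, backups always.

-- Sketch: reassign all subscriptions owned by a departing user to a new owner.
-- 'DOMAIN\OldUser' and 'DOMAIN\NewUser' are placeholder account names.
DECLARE
@OldOwner SYSNAME = N'DOMAIN\OldUser'
,@NewOwner SYSNAME = N'DOMAIN\NewUser';

UPDATE s
SET s.OwnerID = new_u.UserID
FROM
ReportServer.dbo.Subscriptions AS s
JOIN ReportServer.dbo.Users AS old_u ON old_u.UserID = s.OwnerID
JOIN ReportServer.dbo.Users AS new_u ON new_u.UserName = @NewOwner
WHERE
old_u.UserName = @OldOwner;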

To help with this problem in the future, I would recommend a new standard. All subscriptions should be directed to Active Directory distribution lists. Even if the distribution list has only one person in it. I try to use existing distribution lists, but when I can’t, then I work with the Active Directory Admin (or whoever wears that hat). You could make a distribution list for a specific title, position, or category of people. I also made sure the description on the distribution list was filled out and included something like, “Do not delete this distribution list. It is used for subscriptions.”

6. What tool do you use for your report catalog?

There are several ways to create a report catalog. I started with an Excel document. After I outgrew that, I ended up creating a custom work item in TFS (Team Foundation Server). This allowed me to catalog the report, keep track of the workflow as the report went through its life cycle, and associate work items and bugs to the report.

Here is a blog post by Ted Gustaf on creating custom Workflows in TFS 2010.

7. Don’t you think it’s better to get the requirements and have them signed off prior to a mock-up?

Yes, and it would save us some work too. Unfortunately, I’ve found that end users have a hard time verbalizing what they actually want. The changes they request after seeing the mock-ups are not normally scope creep, but a result of how they verbalized their needs when they first gave them. This can range from them not understanding what they really want to them assuming we know their data as well as they do. By creating the mock-up, they have something tangible to look at. It’s analogous to telling a realtor that you want a three-bedroom house, but the first one they show you is not what you were looking for. Now that you’ve seen the house, you can better articulate what you really want in a house.

8. Did you discuss the data quality requirements with your customers?

This is a very good question, and a delicate one. Some customers are very glad you bring this question up, and some customers assume all the data is accurate and will become concerned with past reports if they think the data is “dirty”.

Here are some scenarios and how I handled them. 

  • I was replacing an existing report that had bad data. This is an example of a delicate situation. My manager ended up coming with me to the meeting and backing me up. I showed samples of raw data under both reports and explained why the data was inaccurate before. Always save the explanation and data in an easy-to-locate place, because they will be back asking why a previous report is so different from a current report.
  • I had to aggregate data based on key phrases in a text field. This is an example where the customer is expecting to have a dialog about the quality of the data found. I spent some time looking at the raw data that did not match her phrases exactly and provided recommendations on additional phrases.
  • One of the requirements I was given for a report was based on ranges of numbers. For this report, I approached my client about the data I found that did not fall into the ranges he was expecting. In this case, it uncovered the need to validate the data when it was entered, so the client was grateful.

The bottom line is, be careful how, and to whom, you bring this question up. It is a very important question and should always be addressed. Depending on the situation, it may need to be addressed within your team instead of with the client, directly with the client, or in a very subtle manner with the client.

9. How do you place the Tablix color palette in the template form? And, could you provide, in your blog, info about Microsoft’s suggested methods of maintaining tracking info.

Below are the steps for creating a “color palette” in a new template. For your second question, I don’t know of an article from Microsoft on best practices for tracking info. In next week’s post, I have examples of how I did it.

Steps for adding your own color palette to a template

  • Add a table control (Tablix) to the body of a new report.
  • Remove the header row.
  • Make sure there is a column for each color in your color palette.
  • Make the width of each cell 0.2 in.
  • Change the background property of each cell to the name or hexadecimal value of your colors.

DeGraeve.com is a really cool website that can be used to create a color palette from an image.

Awesome quote from one of the attendees:
A formal report request process gives the requestor an opportunity to think closely about what is really wanted. Does not force it to happen, but increases the likelihood that a good and complete request will be done.

10. What do you use for Version Control?

I have used both Team Foundation Server (TFS) by Microsoft and Subversion, which is an open source product. There are a few Subversion clients out there; I used TortoiseSVN.

For maintaining my SQL source code easily, I also use Redgate’s SQL Source Control product. It’s like a bridge between your repository (TFS, Subversion, etc.) and SSMS. I am happy to talk to anyone about Redgate’s products. Just send me an email. (No, I don’t work for them. I’m just addicted to their products.)

11. Is it better to use Visual Studio or Report Builder?

Better is not the word I would use; more powerful is the phrase I would use. Report Builder has only a subset of the features that you have in Visual Studio. It was designed to empower “power users” and to provide “self-service BI” solutions. If you are designing reports all the time, I would highly recommend using Visual Studio. To see the power of what you can do in Visual Studio, take a look at my Scalable SSRS Reports Achieved Through the Powerful Tablix session. It’s recorded on the Pragmatic Works website.

Here’s some additional terminology for you with regard to the different versions of Visual Studio. The version that comes with SQL Server is all you need to design reports. For SQL Server 2012 and 2014, it’s called SQL Server Data Tools and commonly referred to as SSDT. For SQL Server 2008 and 2008 R2, it was called Business Intelligence Development Studio and commonly referred to as BIDS.

12. What’s the major difference between standalone SSRS and SSRS on a SharePoint server?

I have never worked with the SharePoint version of SSRS. From what I know, the major difference is how the metadata for the management of the reports is stored in SQL Server. The SharePoint tables are used instead of the ReportServer database. I don’t think the creation of the reports is any different.

13. Can excel templates be imported or easily developed so that if I export to excel it looks like the data came from excel?

I’m not aware of a way to bring in an Excel template as an SSRS template. If the user exports the report using the Excel render type instead of the CSV render type, then the report should look like it came from Excel. You might need to email me an image of what you are trying to accomplish for me to answer this question better.

Next weekend I will answer the rest of the questions from this presentation.

Thanks for all the fish

Thanks go out to Liz Hamilton at Pragmatic Works for moderating my presentation. She is wonderful to work with and I loved the movie trivia that she did prior to the presentation.

This is me sitting at a local café with my oldest daughter while I sort, compile, and answer questions. (I think she’s doing Latin homework. I’ll stick to SQL and SSRS.)

Caribbean: The Final SQL Frontier

These are the voyages of the SQL Cruise 2015 cruisers on the ocean liner Getaway. Its seven-day mission: to seek out new SQL knowledge and new islands; to boldly ask questions that no woman or man has asked before.

– Cruiser Mickey Stuewe

Ship’s log, star date 2015.02.07

Dan and I embark on our first day. We plan to explore new islands and learn more about SQL. We meet several of our shipmates during the boarding process. At twelve hundred hours, we meet up with our captain, Mr. Tim Ford, and his first officer, Ms. Amy Ford. While I begin the process of SQL knowledge transfer, Dan decides to get the lay of the ship.

Captain Ford wastes no time on our voyage. Our first class is given while our ship is still docked in port. We have a wonderful presentation by Jeff Lehmann on what is available to us with Amazon Web Services.

That evening we gathered again for an opening party where we were able to meet all the students, instructors, and families (like my husband Dan) that were joining us for the voyage. It was great to meet new people and also to catch up with fellow alumni cruisers like myself. Bill, Joe, and Matisse were all on my SQL Cruise 2 years ago. Matisse and one of the two Patricks were international students.

After the opening party, Captain Ford divided the students into groups for a scavenger hunt sponsored by Amazon Web Services. This is one of my favorite activities. We are not allowed to be in a group where we know anyone. (That meant Dan and I were on different teams.) This is a great way for the students to break the ice and start building friendships. I had so much fun running around taking pictures of the clues we found….and photo bombing the other teams. I am very proud to announce that my team won.

Ship’s log, star date 2015.02.09

The last two days, I have seen some intense training, while Dan has seen some intense relaxation. My classmates and I have been in six 1.5-hour sessions with David Klee, Kevin Kline, Jes Borland, and Grant Fritchey leading the way. These classes have been amazing. Why? Not only are they all amazing speakers whom I could listen to for hours, but the class sizes are small and the sessions are longer than what I normally see. Do you know what that means? It means technical dialog with a room full of experts. Can you give me a Boom Shaka Laka?

We also had “office hours” the last two days, where we were able to ask our instructors and our fellow classmates questions in a relaxed environment. Our office hour topics varied from day to day. Some of the topics I participated in included sharing examples of PowerShell, understanding the consulting world better, and solving problems on VMs.

Ship’s log, star date 2015.02.10

Today was our first day in an island port. We docked in St. Maarten for the day. We took a taxi with some friends to the airport beach where we rented beach chairs and watched the tide come in almost to our toes. It was very relaxing. I did go in the water, which was amazing. We ate lunch in town where our group split into two. My new friend Deidra and I went shopping, while everyone else walked leisurely back to the ship and hung out at a bar near the ship.

Ship’s log, star date 2015.02.11

Today was our second day at an island port. We docked in St. Thomas. This was the only day we set up a planned excursion, and I’m so glad we did. Kevin Kline, Jes Borland, Justin Borland, Dan, and I took a sailboat out to St. John for some snorkeling and sunbathing. This was an amazing adventure with great memories. The ocean was wonderful, it wasn’t too hot, and we saw sea turtles while snorkeling. On the way back, they served drinks and snacks. I discovered that I like rum. I’ll leave it at that.

When we returned to the ship, we had some more office hours. Dan and I then had dinner with Kevin Kline, David Klee, and the lovely Molly Klee. Since Dan and Molly are not data professionals like the rest of us, I didn’t want us nerding out the whole night. Well, I couldn’t have been more pleased. Both Kevin and David enjoy woodworking, which my husband is very into. This gave Dan the opportunity to interact with my SQL Family, which I was thrilled with.

Ship’s log, star date 2015.02.12

Today was another day at sea, which meant we spent another day in class. It was just as amazing, with sessions from David Klee and Kevin Kline. Again, the sessions were full of great knowledge and interaction from the whole class (especially Jared).

After our sessions, we had our second group dinner. This gave us yet another opportunity to ask questions and get to know more about our classmates and their families.

Ship’s log, star date 2015.02.13

Today was our last day on an island. We docked in Nassau, in the Bahamas. And guess what? We had another amazing day. Thirteen of us walked up to an old fort. Along the way we picked up two non-SQL cruisers. (The more the merrier.) Grant Fritchey, Tim Ford, and my husband Dan all REALLY enjoy history, so they loved visiting the fort. On our way back into town, we stopped at a local eatery on the beach. They couldn’t have been nicer. They literally built us a seating area out of old upholstered benches, added umbrellas, and served us food and drinks. That was the best.

Afterwards, we broke up into two groups: one going back towards the ship and the other off to see a rum distillery. Guess which group I was in? Since we weren’t part of an organized tour, the grounds were relatively quiet. This allowed us to leisurely stroll around and ask questions. We also sat for a bit and tasted the various rums they produced.

When we returned to the ship, we had a closing party. We were all very sad that our voyage was at an end, but we were all very happy to be spending the last evening together.

Summary

This was my second SQL Cruise, and it will not be my last. Tim and Amy Ford have really made an exceptional environment for learning. The environment is nurturing, fun, intimate, inclusive, and friendly. You get to spend quality time with the instructors AND with your fellow SQL Family. I absolutely love PASS Summit, but there I feel very torn about whom I will spend my few evenings with. On SQL Cruise, it’s a small group of people and you are all stuck on the boat together. Also, there is plenty of time to ask questions. There are many times at conferences where I don’t get to ask my questions because there is a 30-minute wait AFTER class to speak with the instructor. Not so on SQL Cruise. There’s office hours. There’s dinner. There’s the hot tub. There’s breakfast….

Thanks for all the fish

I want to thank Tim and Amy for spending their time putting the cruise together and handling all the logistics so that we have a place to meet on the ship for the classes, office hours, and group dinners. I also want to thank the sponsors who help keep the cost of the cruise down. Thanks go out to Amazon Web Services, SQL Sentry and Red Gate.
