Archive for T-SQL

Exploring SQL Prompt 6.5

The SQL Prompt Team has been working hard at the Redgate campus in England, and they have outdone themselves yet again with their latest release, SQL Prompt 6.5. I'm extra excited about this release because several of the features I've been wanting for a while are finally available. In this post, I'm going to go over my top three favorite new features.

Tab Coloring

I was so happy to see this feature move from the Experimental Lab List to the Feature List. It was like watching Pinocchio become a real boy. Not only does this feature behave the way it did when it was on the Experimental Lab List, but they added some additional cool features to it.

So what does it do for you? It colors the tabs of each query window based on environment settings that you set up. You’re probably thinking, “Yeah, Mickey, but SSMS already does that in the status bar for me.” Well, the Tab Coloring in SQL Prompt is way better. Here is why.

I rarely look at the status bar, but I’m always looking at the top 20% of the screen, so for me, a very visual person, having the color in that top 20% of the screen is much more effective. SQL Prompt provides that for me, by coloring the tab of the query window and adding a line of the same color below the status bar. If you undock the query window, then the whole window is outlined in the tab color.

[Screenshot: SQLPrompt6_5a]

 

You can create as many environments as you want and you can pick any color…or shade of color that you want. This was one of the improvements to Tab Coloring from when it existed in the Experimental Lab list.

To modify the environments, follow these steps:

  1. Go to the SQL Prompt menu and select Options.
  2. Select Colors from the Tabs list on the left hand side.
  3. Click on the Edit Environments button in the bottom right hand corner, under the table.
  4. Add or modify an environment.
  5. Click on a color to modify the color to meet your needs.
  6. Apply these environments to the servers they represent on the previous window.

[Screenshot: SQLPrompt6_5c]

 

I’m notorious for saving stored procedures in the Master database in development. To help prevent this, I can set up a Tab Color for a specific database on a specific server. For me, that would be the Master database in dev. I love this!

To do this, follow these steps.

  1. Go to the SQL Prompt menu and select Options.
  2. Select Colors from the Tabs list on the left hand side.
  3. Click on the Add Server/Database link in the table.
  4. Add a Server Name in the first column.
  5. Add a Database Name in the second column.
  6. Select an Environment for the color in the third column.

[Screenshot: SQLPrompt6_5d]

Note: In the above example, all three colors are for the same server. This means that when the database is not Utility or Master, the tab coloring will be green…even for new databases on that server.

In Tab History, you can see which environment each query was last run against. In the example below, I can see that I ran SQLQuery6 in the Master database on SQLDemoMonkey. I know this because I set up a database-specific Tab Color for the Master database on SQLDemoMonkey. (I guess I should go make sure I didn't create a stored procedure in that database.)

[Screenshot: SQLPrompt6_5b]

Intelligent Intellisense for GROUP BY

One of the tedious parts of writing a GROUP BY clause is re-typing all the columns that you already wrote in the SELECT list of your query. Many times, I'll copy what I wrote in the SELECT list and paste it under GROUP BY…but then the list still requires editing. Not any more. SQL Prompt now provides intelligent IntelliSense for GROUP BY. Take a look:

 

[Screenshot: SQLPrompt6_5f]

 

In this query I used Ctrl + Space to get the IntelliSense to pop up for my GROUP BY. At the top of the list, there is a shortcut for having all the non-aggregated columns inserted for you. Also notice that RequestType, from line 42 of the query, is not listed at the top of the IntelliSense box because it is an aggregated field.
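To see what that shortcut saves you, here's a hedged sketch of the rule at work (it happens to use the real ExecutionLog3 view, but treat the query itself as illustrative): every non-aggregated column in the SELECT list has to be repeated in the GROUP BY clause, while aggregated columns stay out.

-- Every non-aggregated column in the SELECT list must be repeated
-- in the GROUP BY clause; the COUNT(*) column must be left out.
SELECT
     ItemPath
     ,UserName
     ,[Format]
     ,COUNT(*) AS ExecutionCount -- aggregated, so NOT in the GROUP BY
FROM
     ReportServer.dbo.ExecutionLog3
GROUP BY
     ItemPath
     ,UserName
     ,[Format]
ORDER BY
     ExecutionCount DESC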

Format Actions and Ctrl + K, Ctrl + Y

I wrote a post, back in July of 2014, about SQL Prompt formatting shortcuts.  I wrote:

As a power user, my key strokes will look like this when I use them all at once: Ctrl + bw, Ctrl + bq, Ctrl + bc, Ctrl + ky and finally Ctrl + e to execute my code. Here is an example of how an ugly duckling turns into a beautiful swan by applying all of these hot keys at one time.

That long list of key strokes is now obsolete. (And there was much rejoicing.) I can now select from a list of formatting options that will all be applied with just the Ctrl + K, Ctrl + Y shortcut. This is so much easier!

To modify the list of formatting options, follow these steps.

  1. Go to the SQL Prompt menu and select Options.
  2. Select Actions under the Format list on the left.

 

[Screenshot: SQLPrompt6_5e]

 

It’s  A Wrap

I only listed three of the new features in SQL Prompt 6.5. I encourage you to take a look at the Release Notes for the full list. If you don't have SQL Prompt, then download a free 14-day trial here. If you already have SQL Prompt, then you can go to the SQL Prompt/Help/Check For Updates menu item to get the latest version.

Transaction Log Backups for the Accidental DBA

The Next Quest

In this installment of SQLCoOp, we are writing about backups. I have chosen to focus on Transaction Log Backups. Why? Well, at one of my former jobs, I found myself in the role of Accidental DBA. The company did not have automated backups. They were manually and infrequently performed. I knew this needed to be changed. And this is where my post begins.

[Image: MCJJ_ALL]

Manual and Frequent Backups

Since I wasn’t a DBA, I started small. I performed manual full backups of our databases often.  I made sure that our backups were stored on a different LUN than our data files. It was working great…until it wasn’t. One day everything stopped.

Lessons Learned

I learned several things that day.

1. The difference between the Full Recovery Model and the Simple Recovery Model is how the transaction log is handled. Transactions are logged under both models, but the Simple Recovery Model truncates the log automatically at checkpoints, while the Full Recovery Model keeps the log records until the log is backed up. We were using the Full Recovery Model because I wanted the logged transactions retained.

2. Creating a full backup does not truncate the transaction log at all. You also have to create backups of the transaction log. If you don't, it will continue to grow until it fills the entire drive. And then your database will stop.

3. If you’ve created a dedicated drive for your log files and they use all the space, then your database will stop. You can’t grow the drive without emptying the drive first.

4. After (finally) getting a backup of the transaction log, the Log file will not change in size, even though it is “empty”. You will need to shrink it.

You see, log files can grow as more space is needed. When a backup of the log file occurs, the file stays the same size, but the inactive records inside are truncated so the space can be reused. Once the log file uses up all that space again, it will need to grow again. This is normal. The log file only needs to be shrunk when it is taking up the entire drive. Read this article for the technical explanation of how log files work. (A minimal sketch of the backup-and-shrink commands appears after this list.)

5. If you get stuck and no one in the office can help, then go to Twitter to ask for help. Use the hashtag #SQLHelp with a short synopsis of why you need help. I spent three hours fixing the situation with the aid of a SQL Community member. Why three hours? Well, the drive was full and there was no way to get more space.
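Here is the sketch promised above for lesson 4. It's a minimal, hedged example; the database name, backup path, and logical file name are all placeholders, so look up your own logical file name first.

-- Find the logical name of the log file (placeholder database name).
USE MyDatabase;
SELECT name, type_desc, size FROM sys.database_files;

-- Back up the transaction log, which truncates the inactive portion.
BACKUP LOG MyDatabase
TO DISK = N'D:\Backups\MyDatabase_Log.trn';

-- Only if the file is taking up the entire drive: shrink it back down.
-- Target size is in MB; 'MyDatabase_log' is the logical name found above.
DBCC SHRINKFILE (MyDatabase_log, 1024);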

Tools For Next Time

Now I am armed with some tools that I can use to help prevent finding my log file drive full and tools to fix the problem when it does happen again.

1. The drive that holds the log files needs to be monitored, and an alert needs to be set up to email/text you when the drive is almost full. Make sure to test the alert. It won't do you any good if it can't reach you.

2. Set the maximum log file size to a specific value, and make that value smaller than the drive the log file resides on (see the sketch after this list). Paul Randal (b|t) has a great article called Choosing Default Sizes for Your Data and Log Files that talks about how to choose log file sizes.

The database will still stop if that limit is reached, but provided the drive isn't full, you'll be able to increase the size of the log file so that the database can function again. Then you can fix the problem that caused the log file to fill up.

3. Create a large unimportant file. Name it something like ‘DeleteMeInAnEmergency.txt’ and put it on the drive where your log files are stored. This will guarantee that you will have extra room on your drive when a log file fills the drive. You can then move this unimportant file to another drive which will free up space instantly. You can get your database running again and then fix the problem that caused the log file to fill up.

4. Bookmark, memorize, or get a tattoo of this link from Glenn Berry (b|t) called How to Diagnose and Correct a “Runaway” Transaction Log. It has the steps to fix the log file and make it happy again. While you are there, check out his diagnostic queries to learn more about your database servers.

5. Set up an automated backup plan. This can easily be done with Ola Hallengren's (b) Backup Maintenance Solution (and it's free). It is really easy to set up and configure to meet your needs. While you are there, you can also check out his Index Maintenance Scripts. They are wonderful.
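For item 2 above, here is a minimal sketch of capping a log file's growth with ALTER DATABASE. The database name, logical file name, and size are placeholders; pick a cap that fits your drive.

-- Hypothetical example: cap the log file at 20 GB so it can never
-- consume the whole drive. 'MyDatabase_log' is the logical file name.
ALTER DATABASE MyDatabase
MODIFY FILE (NAME = MyDatabase_log, MAXSIZE = 20480MB);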

NOTE: Always test code from the internet in development and QA before deploying to production.

Don't Stop Yet

Don’t forget to check out these blog posts by the rest of the SQL CoOp team on the subject of Backups:

To follow our quest for SQL knowledge through this collaborative project, follow the #SQLCoOp tag on Twitter.

See you next time!!

On a SQL Collaboration Quest

Four SQL professionals gathered from the four corners of the world to share their SQL knowledge with each other and with their readers: Mickey Stuewe from California, USA, Chris Yates from Kentucky, USA, Julie Koesmarno from Canberra, Australia, and Jeffrey Verheul from Rotterdam, The Netherlands. They invite you to join them on their quest as they ask each other questions and seek out the answers in this collaborative blog series. Along the way, they will also include other SQL professionals to join in the collaboration.

Original Post: On a SQL Collaboration Quest

Questions Answered From Presentation (Part 2): Creating SSRS Reports Efficiently Through Best Practices

On Tuesday, March 3rd, 2015, I had the privilege of speaking for Pragmatic Works. For the month of March, they are highlighting Women in Tech and have the whole month lined up with female speakers on each Tuesday and Thursday. I was the first in the lineup for the month, and it ended up being my largest audience to date. There were 419 people listening in. I even saw some comments about there being others who couldn't make it. Wow!

Thanks to everyone who came to listen to my session. While I couldn't hear anyone, I enjoyed reading the questions and comments that were sent to me afterwards, and let me tell you, there were quite a few. Below you can find the answers to the SECOND HALF of the questions that were asked. The first half of the questions can be found here.

Questions Answered

1. How do you make the report see the test database, production db, or development db?

This is done by changing the data source properties. Say I have a database server called Monkey. I would create a shared data source called Monkey that each of the reports would share. On the Development SSRS Server, I would have the Monkey data source point to the development database. On the QA server, I would have the Monkey data source point to the QA database. And on the Production server, I would have the Monkey data source point to the production database. Each time the report is loaded into the new environment, it would automatically point to the correct database server.

If you are in a situation where environments have to share an SSRS Report Manager (and I'm sorry if you are in that situation), then I would create a separate parent folder for each environment, for instance a parent folder called Development and a parent folder called QA. In each of these parent folders, I would create a folder for all of the shared data sources for that environment. Each parent folder would have its own copy of the report and its own shared data source called Monkey pointing to the correct database.

2. When copyrighting, does your company need to be registered with the copyright office?

The answer to this question should come from your legal department. In my experience, the answer is no. Copyright laws are different than patent laws. When I was an artist, the fact that I created the artwork first meant I owned the copyright, even in another country.

3. Query for getting reports that have not been run.


SELECT
     c.ItemID
     ,c.[Path]
     ,c.Name
FROM
     ReportServer.dbo.[Catalog] AS c
     LEFT JOIN ReportServer.dbo.ExecutionLog3 AS e ON c.[Path] = e.ItemPath
WHERE
     e.ItemPath IS NULL
     AND c.[Path] NOT LIKE '/Decommisioned Reports%' --Ignore folders I don't care about.
     AND c.[Type] NOT IN (1, 5, 8) -- Ignore folders, data sources, and data sets
ORDER BY
     c.[Path]
     ,c.Name

4. How did you add the template file in Visual Studio?

For SSRS 2012, I put the RDL, RSD, and RDS files in the following directory: C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies\ProjectItems\ReportProject. The directory will be different for 2008 and 2014.

5. What if a company doesn’t have any procedure for gathering requirements and creating/keeping documentation?

This puts you in the driver's seat. I would suggest meeting with your manager to discuss the benefits of having requirements documentation and a standard way of maintaining that documentation. I would then outline a plan for starting on this new path. I would start with new reports, and slowly add in existing reports as they get modified.

Awesome quote from one of the attendees:
I ask, "What decision is going to be made with the data from the report?"

6. How do you know the width to fit on landscape and portrait nicely?

I like half-inch margins, so I subtract the two half-inch margins from the width of the landscape or portrait paper, and that becomes the width of the base report. For example, portrait Letter paper is 8.5 inches wide, so the base report would be 7.5 inches wide. I also set the half-inch margins in the Margin property of the Report.

7. Could you show an example of a Tracking Number?

Sure! If I were using an Excel spreadsheet to keep track of my reports, I would number all my reports starting with 1000. (Why 1000? I like starting with 4 digits. No other reason.) If I were using a product like Team Foundation Server (TFS), then I would have each report represented by a work item. In either case, say the report named National Sales Revenue was represented by the number 1500. 1500 would always represent the report National Sales Revenue.

The first time this report is successfully published to production, it would be given a “release number” of 00. So the tracking number would look like 01500.00. After modifications were made to the report and republished to production, it would get a new release number of 01. The new tracking number would look like 01500.01.

In the location where I store all my documents for my reports, the report National Sales Revenue, will be stored under the number 01500. I usually add the name of the report to the folder name so that it looks something like 01500 – National Sales Revenue. This allows for two things. One, you can create the folder before you have a permanent title. (The temporary folder name would just be 01500.) Two, if the title changes later for some reason, you don’t lose the documentation location. It’s still under 01500.

8. What methodology do you use to create the Tracking Number or is it automatically generated from TFS?

When I’m using TFS, it is automatically generated for me. TFS uses a single sequential string of numbers across all the work item types (task, bug, custom work item, etc.), so you may have a report with number 1500, and then the next report will be number 1583. I do put a leading zero or two in front of the number so that when it goes two 5 digits, my folders are still in the correct order.

9. How do you put in the watermark?

a. Click on the base report.

b. Go to the Properties window.

c. Click and open the BackgroundImage property.

d. Set the BackgroundRepeat property to Repeat.

e. Set the MIMEType property to the correct format.

f. If you are using an embedded image, set the following properties:

i. Source = Embedded
ii. Value = iif(First(Fields!EnvironmentID.Value, "GetReportStandards") < 3, "DRAFT", "")

g. If you are using an image from the database, set the following properties:

i. Source = Database
ii. Value = iif(First(Fields!EnvironmentID.Value, "GetReportStandards") < 3, First(Fields!Draft.Value, "GetReportStandards"), "")

h. If you are using an image from a server location, set the following properties:

i. Source = External
ii. Value = iif(First(Fields!EnvironmentID.Value, "GetReportStandards") < 3, "location of your image", "")

10. If a report is set up as a subscription, how can I tell if the subscribers are actually using/accessing the report?

This is an excellent question. The ReportServer Database will tell us which reports were executed through a subscription, but it cannot tell us if they were used. I have some ideas on ways to still capture the information, but they would require research and extra layers of complexity. If you want to hear about these ideas, send me an email.

Awesome idea from one of the attendees:
Maybe we should add a "pixel tag" to the report so that it hits a web server which can track access.

11. This is not the question you are looking for. Move along. Move along.

12. Is there any way to handle row height dynamically in Excel output when merged cells are included? (CanGrow does work with merged cells.)

I don’t know a definitive answer to this question, but my inclination is to say no. Dynamically changing row height/width outside of the auto grow properties, is usually not possible. I have found ways around this by providing multiple layouts of columns/rows and hiding the ones I don’t want.

13. How can you use a business glossary to develop your label and be consistent across reports?

This question is a big one. The simple answer is you need buy-in from the business to make it happen and discipline from the entire team to make it successful. I would suggest you look into the topics of Data Governance, Business Semantics Glossary, and Conceptual Data Modeling. These areas are what would help create and drive the business glossary from the very beginning, so that by the time you are ready to create a report, the terms already exist.

If you can’t get buy in yet, or ever, then I would suggest creating your own business glossary. Excel would work and is easily searchable. Make sure it is stored where your whole team can update it and use it.

14. Any major changes to SSRS 2014?

No. You can see the changes from version to version here. Just select the version you want to look at from the “other versions” drop down at the top of the article.

15. Can you tell what format they placed it in? PDF? Excel?

The Format Field in the ExecutionLog3 view will give you that information. Here is an example looking at the rendered format of a subscription.


SELECT
     ItemPath
     ,UserName
     ,RequestType
     ,[Format]
     ,ItemAction
     ,TimeStart
     ,TimeEnd
FROM
     ReportServer.dbo.ExecutionLog3
WHERE
     RequestType = 'Subscription'
     AND ItemAction = 'Render'

16. How long did it take you to build this framework?

It took me a couple of years. I started by tracking down all the reports and entering them in an Excel spreadsheet. My manager wanted a more permanent solution that could be shared with others, so I was given time to create a custom Report Work Item in TFS. I then moved the whole catalog there and decommissioned the Excel-based catalog.

For the requirements and design document, I slowly saw a pattern of questions that I kept asking, so I designed a requirements document and tweaked it over time.

For my report templates, I looked at existing reports. I took the consistent layouts I found, filled in some things I felt were missing like showing the parameters, and created the templates. They stayed pretty consistent over the years.

17. Is there a way to find out the limit on data exported to Excel? We used to have reports which we could view, but which timed out when exporting to Excel. Is the limit based on the size of the Excel file or the number of rows?

The limit will be how long it takes to do the export. There is a time limit, and if the export is not completed in that time frame, then it will time out. This can be changed with settings in the Report Manager. As far as the amount of data Excel can take, I would Google that, since different versions of Excel have different limits.

18. What’s better to use stored procedure or embedded SQL query and how to determine that?

This is an excellent question. This is also a highly debatable discussion, so this is my stance on the topic.

I prefer stored procedures for the following reasons:

  • Stored procedures are easier to modify when they need to be optimized. It is easier to track down a stored procedure in the plan cache than an inline SQL statement in a report. You also only need to deploy the updated stored procedure back to production, not the report.
  • Dynamic queries are more likely to be classified as an "Adhoc query" by SQL Server. Depending on how your SQL Server properties are set, the Execution Plan may never be kept in the cache, which means you will always have the overhead of creating the best Execution Plan.
  • On the reverse, if your SQL Server properties are set to handle Adhoc queries, then you can cause unnecessary bloating in your plan cache, since every embedded query with different parameter values will be seen as a different query. That is not true of stored procedures…unless they have dynamic SQL inside them.
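To make the comparison concrete, here's a hedged sketch of the kind of parameterized stored procedure I prefer to hand to a report. The table and columns are hypothetical; the point is that SQL Server caches one plan for the procedure regardless of the parameter values each report run passes in.

-- Hypothetical report stored procedure (dbo.Sales is a made-up table).
CREATE PROCEDURE dbo.GetSalesByRegion
     @RegionID INT
    ,@StartDate DATE
    ,@EndDate DATE
AS
BEGIN
     SET NOCOUNT ON;

     SELECT
          s.SaleID
          ,s.SaleDate
          ,s.Amount
     FROM
          dbo.Sales AS s
     WHERE
          s.RegionID = @RegionID
          AND s.SaleDate >= @StartDate
          AND s.SaleDate < DATEADD(DAY, 1, @EndDate);
END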

19. Would you recommend some sort of standards for Data Source Names?

I’m in favor of standards for everything. It satisfies my OCD needs. I keep all my data sources in one folder called either “Data Sources” or “DataSources”, depending if you like spaces in names. For the names, my data source names are the same name as the database they represent. If you have multiple databases with the same name in your production environment, then I would suggest adding the server name to the data source name. You want the Data Source names to represent what they are pointing to.

Name data sets the same way. I use the name of the stored procedure as the name of the data set. If you have the same stored procedure name in different Schemas, then I would include the Schema name in the data set name.

20. How to deploy the SSRS reports using TFS auto build?

I haven’t had the pleasure of using this feature. I unfortunately had to write a deployment document so that the report(s) could be deployed manually.

21. Can you mention the bug again with 2008 R2 with concatenated fields in excel prints?

Sure. In the footer of the report, I had a single textbox that had multiple fields in it. When the report was exported as an Excel document, the textbox was created in the footer of the Excel document, but the font size had been increased to 72 PT font. This bug did not occur on all of our servers.

22. All these best practices are good. I would like to see some more practical examples of how to create hierarchies and links and other stuff, maybe in a next webinar.

Please, please, please email me what you would like to see in sessions. I am always looking for ideas.

I have an example of a way to handle a hierarchy in my other Pragmatic Works session called: Scalable SSRS Reports Achieved Through the Powerful Tablix.

You can look at my recorded presentations page to see a list of presentations that have been recorded.

You can also look on my 2015 Speaking Engagements to see where I will be speaking or hope to be speaking this year.

23. No Russian Thank you? 🙁

Спасибо

Thanks for all the fish

Thanks go out to everyone who attended the presentation, and to those who patiently waited two weeks for me to answer the second half of the questions.

This is me sitting at a local café with my oldest daughter while I write, she studies, and we both drink flat white coffee. (This week she was working on an essay about Frankenstein and an exegetical essay for another class. I'll stick to SQL and SSRS.)

[Photo: Mar 2015 Victoria and Me at Cafe]

Questions Answered From Presentation (Part 1): Creating SSRS Reports Efficiently Through Best Practices

This past Tuesday, March 3rd, 2015, I had the privilege of speaking for Pragmatic Works. For the month of March, they are highlighting Women in Tech and have the whole month lined up with female speakers on each Tuesday and Thursday. I was the first in the lineup for the month, and it ended up being my largest audience to date. There were 419 people listening in. I even saw some comments about there being others who couldn't make it. Wow!

Thanks to everyone who came to listen to my session. While I couldn't hear anyone, I enjoyed reading the questions and comments that were sent to me afterwards, and let me tell you, there were quite a few. Below you can find the answers to the FIRST HALF of the questions that were asked. The other half will be answered next weekend.

Questions Answered

1. The number one question that was asked was: Will your documents, like your requirements document, be available?

Yes! You can find all the assets for the presentation under the Resources\Presentation menu on my website. I list all my presentations on this page, so just scroll to the correct presentation title.

I did get some comments about my SSRS templates missing. That has been fixed. Here are the three separate downloads:

If you missed the presentation, don’t worry. Pragmatic Works will be adding it to their catalog of past presentations here.

2. Hey, can you please share us that sample database?

I’m not sure which sample database this was pertaining to. Please email me.

3.  Would you give that mock up tool again?

It’s called Balsamiq. They are a wonderful company with a great user support forum. You can find tons of templates on their website and get your feet wet with a trial version.

4. Do you have any scripting that can be used to see what subscriptions are set up using all the various parameters?

I broke this question up into two questions because I wasn't clear on whether you wanted existing subscriptions or executed subscriptions.

Executed

The ExecutionLog3 view in the report server database will have this information for you. You can filter the data on the RequestType field to show only subscriptions. The Format field will tell you how the report was rendered, and the Parameters field will tell you what all the parameters were set to. The downside is they use special escape characters, so you'll have to decipher them.


DECLARE
@StartDate DATETIME = '03/1/2015'
,@EndDate DATETIME = '04/5/2015'

SELECT
InstanceName
,ItemPath
,UserName
,RequestType
,ItemAction
,TimeStart
,Status
,Format
,Parameters
FROM
ReportServer.dbo.ExecutionLog3
WHERE
ItemPath NOT LIKE '/DataSets/%'
AND TimeStart >= @StartDate
AND TimeStart <= @EndDate
AND RequestType = 'Subscription'
AND ItemPath <> 'Unknown'

Existing  

For existing subscriptions, you can look in the Subscriptions table in the ReportServer database. This lists the subscriptions that are set up, how they will be rendered, and the parameter values that will be used, as well as other information. Below is a starting-point query.
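Here is a hedged starting point. I believe these joins are correct for the ReportServer database, but verify the column names against your SSRS version before relying on it.

-- List existing subscriptions with their report, owner, and settings.
-- Verify these ReportServer column names against your SSRS version.
SELECT
     c.[Path] AS ReportPath
     ,u.UserName AS SubscriptionOwner
     ,s.Description
     ,s.LastStatus
     ,s.LastRunTime
     ,s.Parameters
FROM
     ReportServer.dbo.Subscriptions AS s
     JOIN ReportServer.dbo.[Catalog] AS c ON c.ItemID = s.Report_OID
     JOIN ReportServer.dbo.Users AS u ON u.UserID = s.OwnerID
ORDER BY
     c.[Path]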

5. Is there a way to bulk change the owner of a subscription? We have people leave, and then when their account is deleted, the subscriptions stop working.

I’ve been asked this before. The information is stored in the ReportServer database, but I am always cautious about directly modifying this database. So , if you change it, then test it heavily  in development before applying the changes to production.

To help with this problem in the future, I would recommend a new standard: all subscriptions should be directed to Active Directory distribution lists, even if the distribution list has only one person in it. I try to use existing distribution lists, but when I can't, I work with the Active Directory Admin (or whoever wears that hat). You could make a distribution list for a specific title, position, or category of people. I also made sure the description on the distribution list was filled out and included something like, "Do not delete this distribution list. It is used for subscriptions."
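For completeness, below is a hedged sketch of the kind of ownership update I have seen used. It modifies the ReportServer database directly, so, as noted above, test it heavily in development first; every name in it is a placeholder, and both accounts must already exist in the Users table.

-- CAUTION: directly modifies the ReportServer database. Test in
-- development first. Both user names below are placeholders.
UPDATE s
SET s.OwnerID = newOwner.UserID
FROM ReportServer.dbo.Subscriptions AS s
     JOIN ReportServer.dbo.Users AS oldOwner ON oldOwner.UserID = s.OwnerID
     JOIN ReportServer.dbo.Users AS newOwner ON newOwner.UserName = N'DOMAIN\NewOwner'
WHERE oldOwner.UserName = N'DOMAIN\FormerEmployee';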

6. What tool do you use for your report catalog?

There are several ways to create a report catalog. I started with an Excel document. After I outgrew that, I ended up creating a custom work item in TFS (Team Foundation Server). This allowed me to catalog the report, keep track of the workflow as the report went through its life cycle, and associate work items and bugs to the report.

Here is a blog post by Ted Gustaf on creating custom Workflows in TFS 2010.

7. Don’t you think that its better to get the Requirements and have them sign off on That ?? Prior to a Mock Up ?

Yes, and it would save us some work too. Unfortunately, I found that end users have a hard time verbalizing what they actually want. I find that the changes they request after seeing the mock-ups are not normally based on scope creep, but on the way they verbalized their needs when they initially gave them. This can range from them not understanding what they really want to them assuming we know their data as well as they do. By creating the mock-up, they have something tangible to look at. It's analogous to telling a realtor that you want a three-bedroom house, but the first one they show you is not what you were looking for. Now that you've seen the house, you can better articulate what you really want in a house.

8. Did you discuss the data quality requirements with your customers?

This is a very good question, and a delicate one. Some customers are very glad you bring this question up, and some customers assume all the data is accurate and will become concerned with past reports if they think the data is “dirty”.

Here are some scenarios and how I handled them. 

  • I was replacing an existing report that had bad data. This is an example of a delicate situation. My manager ended up coming with me to the meeting and backing me up. I showed samples of raw data under both reports and explained why the data was inaccurate before. Always save the explanation and data in an easy to locate place because they will be back asking again why a previous report is so different than a current report.
  • I had to aggregate data based on key phrases in a text field. This is an example where the customer is expecting to have a dialog about the quality of the data found. I spent some time looking at the raw data that did not match her phrases exactly and provided recommendations on additional phrases.
  • One of the requirements I was given for a report was based on ranges of numbers. For this report, I approached my client about the data I found that did not fall into the ranges he was expecting. In this case, it uncovered the need to validate the data when it was entered, so the client was grateful.

The bottom line is, be careful how and to whom you bring this question up. It is a very important question and should always be addressed. It just may need to be addressed by your team instead of the client, directly with the client, or in a very subtle manner with the client.

9. How do you place the Tablix color palette in the template form? And, could you provide, in your blog, info about Microsoft’s suggested methods of maintaining tracking info.

Below are the steps for creating a "color palette" on a new template. For your second question, I don't know if there is an article from Microsoft on best practices for maintaining tracking info. In next week's post, I have examples of how I did it.

Steps for adding your own color palette to a template

  • Add a table control (Tablix) to the body of a new report.
  • Remove the header row.
  • Make sure there is a column for each color in your color palette.
  • Make the width of each cell 0.2 in.
  • Change the background property of each cell to the name or hexadecimal value of your colors.

DeGraeve.com is a really cool website that can be used to create a color palette from an image.

Awesome quote from one of the attendees:
A formal report request process gives the requestor an opportunity to think closely about what is really wanted. Does not force it to happen, but increases the likelihood that a good and complete request will be done.

10. What do you use for Version Control?

I have used both Team Foundation Server (TFS) by Microsoft and Subversion, which is an open source product. There are a few clients for Subversion out there. I used TortoiseSVN.

For maintaining my SQL source code easily, I also use Redgate's SQL Source Control product. It's like a bridge between your repository (TFS, Subversion, etc.) and SSMS. I am happy to talk to anyone about Redgate's products. Just send me an email. (No, I don't work for them. I'm just addicted to their products.)

11. Is it better to use Visual Studio or Report Builder?

Better is not the word I would use. More Powerful is the adjective I would use. Report Builder only has a subset of features that you have in Visual Studio. It was designed to empower “power users” and to provide “self service BI solutions”. If you are designing reports all the time, I would highly recommend using Visual Studio. To see the power of what you can do in Visual Studio, take a look at my Scalable SSRS Reports Achieved Through the Powerful Tablix. It’s recorded on Pragmatic Works website.

Here’s some additional terminology for you with regards to the different versions of Visual Studio. The version that comes with SQL Server is all that you need to design reports. For SQL Server 2012 and 2014 it’s called SQL Server Data Tools and commonly referred to as SSDT. For SQL Server 2008 and 2008R2 it was called Business Intelligence Development Studio and commonly referred to as BIDS.

12. What’s the major difference of standalone SSRS and the SSRS on SharePoint server?

I have never worked with the SharePoint version of SSRS. From what I know, the major difference is how the metadata for the management of the reports is stored in SQL Server. The SharePoint tables are used instead of the ReportServer database. I don’t think the creation of the reports is any different.

13. Can Excel templates be imported or easily developed so that if I export to Excel, it looks like the data came from Excel?

I’m not aware of the ability to bring in an Excel templates as an SSRS template. If the user exports the report using the Excel render type instead of the CSV render type, then the report should look like it came from Excel. You might need to email me an image of what you are trying to accomplish for me to answer this question better.

Next weekend I will answer the rest of the questions from this presentation.

Thanks for all the fish

Thanks go out to Liz Hamilton at Pragmatic Works for moderating my presentation. She is wonderful to work with and I loved the movie trivia that she did prior to the presentation.

This is me sitting at a local café with my oldest daughter sorting, compiling, and answering questions. (I think she's doing Latin homework. I'll stick to SQL and SSRS.)

[Photo: Cafe201503]

T-SQL Tuesday #62 – Primary Key Constraints Are Good For Your Database

Robert Pearl (b|t) is hosting this month’s T-SQL Tuesday blog party. The party was started by Adam Machanic (b|t) in December of 2009.

This month’s invitation is about making sure your database is healthy. For me, that is not just making sure that patches are updated and backups restore properly. It also means the database schema is healthy.

In the beginning

One of the simplest things that we can do when creating a new table is to make sure there is a primary key constraint. A primary key constraint is simply a specialized index. It can be a clustered or non-clustered index. The important part is that it is created as a primary key constraint.

Why?

Two strong reasons come to mind on why you want to have primary key constraints on your tables. The first is the accuracy of your data. The purpose of a primary key is to uniquely identify a record (just like your fingerprints). By using primary key constraints, SQL Server will guarantee that the primary keys are unique.

I’ve had application developers argue with me that the application will guarantee that the primary key is unique, so a primary key constraint is not needed. I’ve also had other developers tell me that using GUIDS or identities will guarantee uniqueness. Even if you use these data types or have checks in your application to verify the value is unique, it is still possible to go around the application to insert data and it is possible for multiple applications to simultaneously provide what they think is a unique value. A primary key constraint can will prevent duplication.

This brings me to the second reason. Say you do have a spectacular way to keep all of your primary keys unique. Without primary key constraints, you can't implement Transactional Replication. That's right. Transactional Replication requires primary key constraints.
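As a quick illustration (the table and column names are made up), a primary key constraint can be declared inline when the table is created, or added to an existing table afterwards:

-- Hypothetical table with an inline primary key constraint.
CREATE TABLE dbo.Customer
(
     CustomerID INT IDENTITY(1,1) NOT NULL
     ,CustomerName VARCHAR(100) NOT NULL
     ,CONSTRAINT PK_Customer PRIMARY KEY CLUSTERED (CustomerID)
);

-- Or, alternatively, add the constraint to an existing table.
ALTER TABLE dbo.Customer
ADD CONSTRAINT PK_Customer PRIMARY KEY CLUSTERED (CustomerID);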

How can I find my straying tables?

Now that I’ve (hopefully) peeked your curiosity, let’s look at a way to find a list of tables that are missing primary key constraints.

SELECT
     t.name AS TableName
FROM
     sys.tables AS t
     LEFT JOIN sys.indexes AS i ON i.object_id = t.object_id
          AND i.is_primary_key = 1
WHERE
     i.is_primary_key IS NULL
ORDER BY
     t.name

Remember, it is not a good idea to "just add" new constraints without determining the impact on the current system. Once you have the list of tables that are missing primary key constraints, you can go through the process of determining whether you can add them to the environment without side effects. Some side effects include having to deal with duplicate data, increased database size from the new indexes, and (happily) better performance.

Thanks for all the fish

Thanks go out to Robert Pearl for hosting this month’s T-SQL Tuesday blog party.

SELECT Thankfulness FROM Person.Person

[Image: ThankYou500]

Hi. I'm with Bob

I love how blog posts can cascade into each other. I follow Chris Yates' (b|t) blog. He wrote a post that was inspired by one from Bob Pusateri (b|t). And here I am joining the blog party. Hopefully you too will join the blog party and write about something you are thankful for.

I’m thankful for…

I’m thankful for a lot and not appreciative of enough. One of the non-SQL things that I’ve learned more about is the world. Some of my SQL Friends have gotten to where they are at the hard way, through the school of hard knox. Every time I hear another story, it makes me think of how lucky I have been. I am very thankful for the sobering stories.

I’m thankful for my Holistic Nutritionist, Kristi Acuna (b). Without her, I could not be an active member in the SQL Community. I started seeing her in 2009. She immediately found foods that I was eating that my body did not function well on. Things like gluten, tomatoes, potatoes, etc. As I slowly took those foods out of my diet, I started to have more energy. Enough energy to start things like blogging on the weekends, attending SQL Saturdays, speaking at conferences, and staying up late to network after the conferences. Without her, you wouldn’t know me.

I’m thankful for my husband, Dan. He supports me and encourages me to be me. For some of you that may seem obvious. Why would he not allow me to be me? But not all are so lucky. I know men and women a like who participate in local SQL Saturdays only, or in user groups occasionally because their spouse needs them at home. Dan encourages me to write. He encourages me to lead my user group. He encourages me to present at conferences. In fact, when I was really bummed about work this past summer. His first question was, “When do you see your SQL Family again?” How cool is that?! He knows I’m an extrovert and he makes sure I get that interaction.

I’m thankful for this AMAZING SQL Community we call SQL Family. The first half of my career I was a Visual Basic programmer. VB 3.0 through VB.Net. I left IT twice because of the way I was treated by co-workers. I came back kicking and screaming. I was blessed to land in the world of SQL and to find an amazing community who accepts, supports, and encourages each other.

Down to the details

I want to thank some specific people, but some of them are private people. So I’m going to send out some digital “thank you cards”. I encourage you to do the same. Say thank you to the people who have touched your lives and made you who you are today.

Thanks for all the fish

Thanks go out to Bob Pusateri (b|t) for this great idea.

T-SQL Tuesday #60 – Stretching Past My Comfort Zone

My good friend Chris Yates (b|t) is hosting this month’s T-SQL Tuesday blog party. The party was started by Adam Machanic (b|t) in December of 2009.

This month’s invitation is about something we’ve learned recently. Being that I just got back from Summit, I wasn’t quite sure what to pick. I ended up picking something prior to Summit. I learned how to recover from my Transaction Log filling up the disk via our SQL Community.

Silly rabbit, #SQLHelp is not just for kids

Hashtags have a WIDE variety of uses. They are used to express emotions, categorize tweets, to follow global discussions, and even to state the obvious. But my favorite use is to get help from our community with the use of #SQLHelp, #SSRSHelp, and others. And this is where my story begins… Freaking out and going immediately to #SQLHelp.

I am not a DBA. I’m a BI girl. I’m a Data Modeler. I’m a Database Developer. But please, oh please, don’t ask me to troubleshoot a serious problem on a failover cluster. That is not my cup of tea… and that is exactly what I found myself doing one Wednesday afternoon.

Before the internet, I would have been completely helpless. Today I'm not. Why? Because I have an entire community behind me. One that does not think twice before lending a hand. I posted a request on Twitter asking for assistance with my problem, and I included the #SQLHelp hashtag. Less than a minute passed before I had TWO community members offering to help. Wow! (I took the help of only one of them, since it would have been confusing to have them both help.) He was kind enough to answer my questions and walk me through the solution over the course of a few hours. That is what I call Community. That is what I call #SQLFamily.

Earning my DBA card

I now know how to fix this problem. I even had the "opportunity" to try out my new knowledge a few weeks later due to some deadlocks and the lack of email notifications set up on our monitoring software. (Don't ask.)

Lesson 1

I thought that a full backup also took care of the transactions stored in the Transaction Log. This is false. In order to have your Transaction Log truncated, you have to create Transaction Log backups. (Lesson Learned)

Lesson 2

Once the disk is full, you can't make a backup of the transaction log. To get around this, you add a new, tiny, empty log file to the database on a drive that has free space. I don't know the technical reasoning, but this will allow a backup to be created, thus truncating the very full log file. After that, you can shrink the really big original log file to a reasonable size. Finally, you can remove the tiny, empty log file that you temporarily created. (A sketch of the sequence appears below.)

When troubleshooting a log file that has filled up a disk on a failover cluster, you can only create a new log file on a shared resource. I had none. Luckily, I had a brand new database on that server. The database had been released into production but was not yet used, so I was able to remove it from the server and create a new transaction log file in its place. (And there was much rejoicing.)
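Here is a hedged sketch of that sequence. Every name, path, and size is a placeholder, and on a cluster the new file must go on a shared drive with free space.

-- 1. Add a tiny temporary log file on a drive that still has space.
ALTER DATABASE MyDatabase
ADD LOG FILE (NAME = TempLog, FILENAME = N'E:\SQLLogs\TempLog.ldf', SIZE = 10MB);

-- 2. Back up the log, which truncates the full original log file.
BACKUP LOG MyDatabase TO DISK = N'D:\Backups\MyDatabase_Log.trn';

-- 3. Shrink the original log file back to a reasonable size (in MB).
DBCC SHRINKFILE (MyDatabase_log, 1024);

-- 4. Remove the temporary file (it must be empty before it can be removed).
ALTER DATABASE MyDatabase REMOVE FILE TempLog;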

Lesson 3

Fast forward to SQL Saturday Portland, right before Summit. I took a class from Randy Knight (b|t). His presentation was called Locks, Blocks, and Deadlocks Oh My! In his session, he spoke about how the transaction log can get filled during a deadlock (been there, done that, got the T-shirt). He suggested creating some fake files on the drive to reserve space for when that awful day comes and your disk is full. I had been wondering if that would be a good solution. Not only did he confirm that it was, but he also gave a great way to quickly create five 1 GB files that can be removed as needed to get the server back on its feet.

Conclusion

The moral of this story is not only to have Transaction Log Backups as well as Full Backups, but to also nurture your network. Your network is like gold. They are there to lift you up when you need it and to help you through technical challenges that are outside of your comfort zone.

Thanks for all the fish

Thanks go out to Chris Yates for hosting this month’s T-SQL Tuesday blog party. Thanks also go out to all the community members who quickly respond to people in need.

Blogger Q & A

PASS Summit 2014 Keynote – Evolving Microsoft’s Data Platform – The Journey to Cloud Continues

SQL in the City Keynote – LIVE BLOG
