T-SQL Tuesday #46 – And I shudder

Rick Krueger (b|t) is hosting this month’s T-SQL Tuesday blog party. The party was started by Adam Machanic (b|t) in December of 2009.  The topic this month is a fun one. He has asked us to write about a Rube Goldberg type of solution (aka a hack job) we created at some point in our careers.

Like many of you, I worked at a start-up back in 2000, but I wasn't a database developer then. I was a web developer who specialized in the business layer, which I wrote in VB and later VB.NET. It never occurred to me to consult with the DBA to see if my idea was a good one from a database point of view. I can tell you that I shudder at the mere thought of the "brilliant" solution I came up with back then… It was actually a great idea…until you put it in a database and queried it.

The Plan

The company I worked for was building a shopping cart from the ground up. We needed a way to keep track of the state of each item in an Order and display that information at the Order level. My idea was to keep track of each Order Item state with an alphanumeric letter and we would string them all together into one field at the Order level so that we could query for Orders with Items at a particular state. (This is where we all shudder.)

When I thought of the idea, I thought there would only be a couple of letters: R for Returned, O for Ordered, C for Canceled, etc. As we went through all the permutations, we found a ton of exceptions. By the time I handed over responsibility for this one field, we had almost 10 different designations.

The Flaw

Since (back then) I could read execution plans about as well as I could read Japanese (which is to say, not at all), I had no idea of the impact of my design. I didn't know that indexes on that field would become pretty much useless, since a LIKE operator would have to be used to find a particular part of the concatenated field.

A better approach for the same requirement would be to leave the designations at the Item level and concatenate the values at the Order level for reporting.
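For illustration, here is a minimal sketch of both designs. The table and column names are hypothetical; the original schema is long gone.

```sql
-- The hack: one concatenated status string per Order.
CREATE TABLE dbo.[Order]
(
    OrderID      int         NOT NULL PRIMARY KEY,
    ItemStatuses varchar(50) NOT NULL  -- e.g. 'OOCR' for a four-item order
);

-- Finding Orders with a Canceled item forces a non-sargable LIKE,
-- so an index on ItemStatuses can only be scanned, never seeked.
SELECT OrderID
FROM dbo.[Order]
WHERE ItemStatuses LIKE '%C%';

-- The better design: keep the status on each Order Item...
CREATE TABLE dbo.OrderItem
(
    OrderItemID int     NOT NULL PRIMARY KEY,
    OrderID     int     NOT NULL,
    Status      char(1) NOT NULL  -- 'O', 'C', 'R', ...
);

-- ...filter with a sargable predicate, and concatenate only for display.
SELECT oi.OrderID
FROM dbo.OrderItem AS oi
WHERE oi.Status = 'C'
GROUP BY oi.OrderID;
```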

Second Verse, Same as the First

Since I never learned at the first company that my idea was flawed from a database perspective, I pitched the same plan at the next company…where we implemented it…again. I have since learned my lesson, and I advocate to all .NET developers that working with the DBA or DBD from the beginning helps achieve a better application.

Thanks for all the fish

Thanks go out to Rick Krueger for hosting this month’s T-SQL Tuesday blog party. Please visit his website at http://www.dataogre.com.

Fall Speaking Engagements

A year ago I had no intentions of speaking this year. I didn’t think I was mentally ready for public speaking once again. Boy was I wrong. Not only was I ready, but I’ve become addicted to speaking at various SQL Events. I even found a SQL Saturday to speak at on my family vacation in Louisville, Kentucky. Between September and October I will be speaking at 5 venues and moderating two Women in Technology (WIT) panels. That will be a total of 8 sessions. Here is a bit about each of them.

SQL Saturday #249 takes place in San Diego, CA on Saturday, September 20th. This year, instead of a pre-con the day before, Red Gate will be hosting a half-day event called SQL in the City Seminar.

Dev Connections takes place in Las Vegas, NV between September 30th and October 4th at Mandalay Bay. I’m very excited about my two brand new presentations.

SQL in the City takes place in three different cities in October. I will be speaking at two of them. I’ll be in Pasadena, CA on October 9th and I’ll be in Charlotte, NC on October 14th the week of the PASS Summit.

PASS Summit takes place in Charlotte, NC between October 15th and 18th. This is the most amazing conference for the SQL community. While I’m not speaking at it this year, I am participating in other ways including being a buddy to a new attendee or two.

SQL Saturday #237 also takes place in Charlotte, NC on Saturday October 19th.

  • This is my last SQL Saturday of the year and it is the “BI Edition”, so I had to do my session, Scalable SSRS Reports Achieved Through the Powerful Tablix.

I’m looking forward to these sessions and to meeting new people as well as reconnecting with my old friends. I hope to find you in my sessions and that you have the time to introduce yourself. I really enjoy meeting new people and sharing the knowledge that I have.

SQL on my friends.

T-SQL Tuesday #45 Summary–Follow the Yellow Brick Road

This is the summary of blog posts for T-SQL Tuesday #45 – Follow the Yellow Brick Road. I was the lovely host this month, with the topic being Auditing. As always, I enjoyed all the posts. It reminds me of what Forrest Gump once said:

 “It’s like a box of chocolates. You never know what you’re going to get.”

So, let’s meet up with Dorothy to find out what kind of auditing related advice and stories she found on her journey.

TSQLTuesday45 - YellowBricks

Summary

Jason Brimhall had a wonderful post about using the information that SQL Server automatically gathers for us. He used the sys.fn_trace_getinfo function to find out when, and by whom, the DBCC SHRINKFILE command was executed in his environment.
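For reference, a minimal sketch of that technique (my reconstruction, not Jason's code) might look like this. It assumes the default trace is still enabled, which it is out of the box:

```sql
-- Property 2 of fn_trace_getinfo is the path of the active trace file.
DECLARE @TracePath nvarchar(260);

SELECT @TracePath = CAST(value AS nvarchar(260))
FROM sys.fn_trace_getinfo(DEFAULT)
WHERE property = 2;

-- EventClass 116 is the Audit DBCC Event; TextData holds the command text.
SELECT t.StartTime, t.LoginName, t.TextData
FROM sys.fn_trace_gettable(@TracePath, DEFAULT) AS t
WHERE t.EventClass = 116
  AND t.TextData LIKE '%SHRINKFILE%'
ORDER BY t.StartTime;
```

Keep in mind the default trace rolls over, so only recent activity is captured.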

Rob Farley wrote a thought-provoking post about how you should think about the auditing data you do have, and about not making excuses when you don't have it.

Boris Hristov wrote a piece about the impact of external auditing entities on our activities. In this case, it was a story about unintentionally scaring the client about database security levels and how that could have impacted a SOX audit. Side note: I always enjoy Boris' enthusiasm about T-SQL Tuesday. (Keep it up, Boris!)

Chris Yates wrote a great summary piece about what to audit and a list of various tools to use to accomplish the auditing.

Steve Jones wrote a wonderful piece on why auditing is important and how he caught a "cowboy coder".

Steven Ormrod wrote an excellent piece on how to use a DDL Trigger to not only find out who modified the database, but from what IP address and application they did the deed. This is definitely one to bookmark.

Glenda Gable shares with us how she learned about writing triggers that audit. She then goes on to share with us how she had an opportunity (and seized it) to learn and implement Change Data Capture (CDC).

Jeffrey Verheul wrote about taking the initiative at a new company and creating a baseline on the system health of the 70 databases he was responsible for… and it paid off in a relatively short amount of time.

Dave Green wrote a thought-provoking post about what to do with the audit data during a data migration. I particularly enjoyed this post, since I get to ask my clients those very same questions next week. (Can we say perfect timing?)

Robert L Davis wrote about a type of auditing that I hadn't even considered, and I'm happy he did. He wrote about a new feature in SQL Server 2012 that automatically audits the execution of your SSIS packages. He goes on to share how he was able to use the feature.

Mickey Stuewe (that would be me), wrote about a simple way to audit at the data level. She also made a plea at the end of her post to all the women bloggers out there. She asked them to contact Adam Machanic (b|t)  about being a T-SQL Tuesday blog party host.

…And That’s A Wrap

Thank you to the 11 bloggers who participated this month. We couldn't have T-SQL Tuesday without you. Thanks also to Adam Machanic, who put this brilliant idea into action and makes sure it continues each month. … Only 14 more days until the next invitation goes out.

T-SQL Tuesday #45–Follow the Yellow Brick Road

This month I'm the lovely host of the T-SQL Tuesday blog party, which Adam Machanic (b|t) started in December of 2009. The topic this month is Auditing. I chose this topic because auditing, in some form, is needed for all databases (IMO).

Was it the lion, tiger or bear? Oh my!

One of the databases I used to manage had a schema where each table had four fields to keep track of who originally created the record and who last modified it. The dates of these two events were also stored. These four fields were populated by the .NET application that used the table.

TSQLTuesday45 - Table

We found, over time, that this solution had a couple of holes.

1. Values were missing when someone inserted or updated data directly in the database.
2. Values were missing when the .Net programmer forgot to add code to populate the fields.

Luckily, the solution was a quick and easy one.

First, defaults were set on the InsertedBy and InsertedDts fields, SUSER_NAME() and GETDATE() respectively. This provided the user’s account that was used to access the table and the current date.
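A sketch of those default constraints might look like the following. The constraint names are my own; only the column names and functions come from the description above.

```sql
-- Capture who inserted the row and when, even when the application
-- (or an ad hoc script) doesn't supply the values.
ALTER TABLE dbo.ClientOrder
    ADD CONSTRAINT df_ClientOrder_InsertedBy
        DEFAULT (SUSER_NAME()) FOR InsertedBy;

ALTER TABLE dbo.ClientOrder
    ADD CONSTRAINT df_ClientOrder_InsertedDts
        DEFAULT (GETDATE()) FOR InsertedDts;
```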

Second, an update trigger was created to populate ModifiedBy and ModifiedDts. This allowed the application to insert values if they were provided, but default values were available when they weren’t. More importantly, when someone modified the data directly, their login was captured.

CREATE TRIGGER dbo.trg_ClientOrder_iu ON dbo.ClientOrder
	AFTER UPDATE
AS
	SET NOCOUNT ON;

	-- Keep the values supplied by the application when present;
	-- otherwise fall back to the current login and date.
	UPDATE tb
	SET
		ModifiedDTS = ISNULL(i.ModifiedDTS, GETDATE())
		,ModifiedBy = ISNULL(i.ModifiedBy, SUSER_NAME())
	FROM
		dbo.ClientOrder AS tb
		INNER JOIN inserted AS i ON tb.ClientOrderID = i.ClientOrderID;
GO

Here you can see that when I inserted the first two records, my login was captured. On the third record I provided values for InsertedBy and InsertedDts.

TSQLTuesday45 - Data1

Next I updated the first record without providing a ModifiedBy or ModifiedDts. On the second row, I did provide a ModifiedBy and ModifiedDts with my update.

TSQLTuesday45 - Data2

This simple solution gives you visibility into who changed data and when. While it is not a comprehensive solution, it answered the questions we had over the years.

Thanks for all the fish

Usually this is where I thank the host, but as Spock would say, “That would be self-serving.” So instead I am making a plea to all the amazing female bloggers out there. You know who you are. I want you to ask Adam Machanic (b|t) if you can host the T-SQL Tuesday blog party. Why am I making this plea? I’m glad you asked.

Being a data geek, I went through all the past T-SQL Tuesday invitations, which Steve Jones (b|t) so graciously keeps up to date. (Thanks Steve!) I found that out of the 45 invitations, only FIVE hosts were women.  The other amazing fact was that three of us didn’t host until this year!

  • Jes Schultz Borland (b|t)
  • Erin Stellato (b|t)
  • Jen McCown (b|t)
  • Wendy Pastrick (b|t)
  • And yours truly

Ladies, that is only an 11% representation. Please, please, please email Adam Machanic (b|t) and ask him if you can host. The rules are here.

Ok, I’m getting off my soap box. 🙂

T-SQL Tuesday #45 Invitation–Follow the Yellow Brick Road

This month marks the 45th month of Adam Machanic's (b|t) T-SQL Tuesday blog party, which started in December of 2009. Each month an invitation is sent out on the first Tuesday of the month inviting bloggers to participate in a common topic. On the second Tuesday of the month all the bloggers post their contribution to the event for everyone to read. The host sums up all the participants' entries at the end of the week. This month I'm the host and the topic is …

Auditing

An audit trail is needed for various reasons. Some companies need it for compliance, others need it to find out who "accidentally" did something stupid last week, and some specialized audit trails can tell you how the data has changed over time.

So, it is time to follow Dorothy and Toto down the yellow brick road and to share your experience with auditing data. If you are new to the T-SQL Tuesday blog party and need some ideas, here are a few:

  • How to implement SQL Server Audit, which was introduced in SQL Server 2008.
  • Your favorite audit pattern.
  • Your worst experience with an implementation of a bad auditing pattern.

Pay no attention to that man behind the curtain!

The Great and Powerful Oz has spoken! … and these are the rules.

Rule 1: Make sure that you include the T-SQL Tuesday image at the top of the post which will help identify your post as a T-SQL Tuesday blog post.  Please include a link back to this invitation too.

Rule 2: Publish your blog post sometime next Tuesday, GMT (here's a link to a GMT time converter). For example, in California that covers 5 pm Monday to 5 pm Tuesday (PDT).

Rule 3: Come back here and post a link in the comments so that I can find all the posts for the round up.

Rule 4: Don’t get yourself fired. Make sure that you either generalize your post or get permission to blog about anything from work.

Rule 5: If you roam the Twitterverse, then don’t forget to Tweet about your blog post with the hashtag #tsql2sday.

Rule 6: Go read someone else’s blog on the subject!

Final Rule: Have fun!

 

Spreading your Wings: Seven Tips to Help Find a New Job

Just as a butterfly takes time to change from a caterpillar, I too have been changing this year. I had a set of goals that started forming last July and were solidified by December. Now I'm ready to take flight on one of those goals.

I started looking for a new job last July, but I realized I wasn't quite ready to leave my team. In February I started actively looking again. I have spent the last five months looking, and I can now say I am the proud owner of a cool new job. I start right after my two-week vacation, which starts today. (And there was much rejoicing.)

I will miss my current team, but my new job is close enough to still have lunch with them and they do attend the OC SQL User Group meetings.

I wanted to share with you my journey and the seven tips I found most helpful.

Tip 1: I made a list and I checked it twice

I'm a detail person with OCD tendencies, so I made a list. As I've mentioned before, I like mind maps, so I created one for my dream job. It had everything I wanted, including things like the size of the company, the size of the team, and things I didn't want to be responsible for. (Knowing what you don't want is as important as knowing what you do want.)

Here is a list of things to consider for your list:

  • What is the ideal company size?
  • What is the ideal team size?
  • How much overtime are you willing to endure?
  • Are you willing to support 24-hour uptime?
  • What parts of SQL Server or other products do you want to work with?
  • What parts of SQL Server or other products do you not want to work with?
  • How old can the technology be?
  • What are you willing to give up so that you can have other items from your wish list for a new job?
  • What kind of commute are you willing to endure?

Tip 2: I found a Mentor

I am a strong proponent of having (or being) a mentor, and I blogged about it here. I knew I was going to start looking for a new job in February, so I started looking for a mentor in December. That allowed me to find one who could help me figure out how to get a new job. You see, this is the FIRST job where I didn't have an "in". I knew someone at all the other places I've worked.

Here are some of the things my mentor helped me with:

  • He gave me advice on where to look for jobs.
  • He gave me advice on areas I should read up on that would help round out my current skill set.
  • He also kept reminding me to blog. This was probably the best advice he gave, because it came up in my interview.

I wanted to make sure I was qualified for the dream job I wanted so I had him quiz me. (Yes, I passed.)

Tip 3: I signed up for job alerts

I found that Indeed.com had the best search engine. It aggregates feeds from other job search websites and compiles them into a single list. You can set up several alerts, and they'll email you new entries that match your criteria. Other websites have the same feature, but I found them to be inaccurate. One of them actually sent me jobs that had absolutely nothing to do with my skill set.

Here are some tips:

  • I created a temporary email address just for recruiters and the job search websites. Now that I have a job I don’t have to worry about my normal email being flooded by recruiters who are still trying to find me a job.
  • I created job search alerts on multiple sites, but Indeed.com was the best.
  • I created a job search that had keywords I wanted (like SQL and Database Developer) and I excluded words in the search as well (Oracle is not for me).
  • I created an additional job search that also included a minimum salary. This was always my first email to read, and it's the email in which I found my new job.

Tip 4: I networked

This might be hard for some people, but networking is a great tool. You can network with your past teammates through lunches or LinkedIn. You can attend user group meetings and talk to the other attendees. Here is a link to find a local PASS chapter. Networking also helps you refine your wish list and weed out things you hadn’t considered, but came up in conversations. While networking didn’t find me my new job, I did get to spend time with past colleagues and they did try to get my foot in the door. It just wasn’t meant to be.

Tip 5: I blogged about my everyday accomplishments

My mentor was very adamant about this, and I'm so glad he was. If he hadn't seen a post from me in two weeks, he reminded me to write one… on anything. I made sure I participated in the T-SQL Tuesday blog party every month. I found the best way to find the current host of the blog party was to look up the Twitter hashtag, #TSQL2sday.

Why, you may ask, was this such a great idea? Well, it was brought up in my interview. “Mickey, I was going to ask you about sub-queries, but I read about it in your latest post.” There was at least one other interview question covered on my blog as well. (That was really awesome by the way.)

It also gives you a great reference for your abilities.

Tip 6: I asked questions

Before I went to my first interview I researched questions I should ask. I Googled questions to ask, I asked my mentor, and I went through my wish list to create a list of questions to ask at my interviews.

Here are few of my questions

  • What would an average week look like for me?
  • What tools do you use besides SSMS? (This was very important to me since I’m a Red Gate junkie.)
  • What are you looking for in an employee?
  • How many databases do you have and what are their sizes and versions?

On my second interview I asked: Is there anything that stands out that might make you think I'm not a good candidate? That was a hard question to ask; I actually read it to them. They were impressed, and they had to think about it. After they answered, they turned the tables and asked me the same question. It turned out we had the same concern: I would be moving from a HUGE company down to a SMALL company. But that was also an attribute we were both very excited about. They liked the fact that I like structure; it's something they want to (slowly) move to. And I wanted to be at a small company. That was on my wish list.

Tip 7: I was picky

I did work with a couple of recruiters, but their attention span wasn't the best. My mentor called them "a necessary evil" and I agree. Whenever they called, I would only be willing to set up an interview if the potential job met my wish list to my satisfaction. I was also picky about what time of day I was willing to meet for interviews, since I still had a day job. There was one job that looked perfect on paper, but they neglected to list DB admin duties, and I was only interviewing for DB developer positions. They were really interested in me and I in them, but I stuck to my guns and passed on a second interview.

I’m glad I did, because my new job was worth the wait.

T-SQL Tuesday #44 – Whoa. Déjà Vu…It Happens When They Change Something

 

Bradley Ball (b|t) is hosting this month's T-SQL Tuesday blog party. The party was started by Adam Machanic (b|t) in December of 2009. The topic this month is Second Chances. Bradley has asked us to write about something we would like to have changed if we were given a second chance. I'm going to write about something that I actually was given a second chance to do differently.
Setting the stage


[Neo sees a black cat walk by them, and then a similar black cat walk by them just like the first one]
Neo: Whoa. Déjà vu.
[Everyone freezes right in their tracks]
Trinity: What did you just say?
Neo: Nothing. Just had a little déjà vu.
Trinity: What did you see?
Cypher: What happened?
Neo: A black cat went past us, and then another that looked just like it.
Trinity: How much like it? Was it the same cat?
Neo: It might have been. I’m not sure.
Morpheus: Switch! Apoc!
Neo: What is it?
Trinity: A déjà vu is usually a glitch in the Matrix. It happens when they change something.


The Matrix (1999)

First pass

This is an ironic topic. Today is the last day at my current job, and the topic I've chosen has to do with one of my first assignments four and a half years ago. I first arrived at my company in January. Do you know what happens in January? The Sales Team has a new structure that needs to be applied to all of their reports as of yesterday. That particular year, they added a new layer to their hierarchy. The database model couldn't handle it, and neither could the reports. I proposed a new model using recursion, both in the database model and in the reports, and it was approved. It provided flexibility in the years to come. One flaw remained, though: it maintained the current practice of assigning Clients to Sales People. That doesn't sound too bad, until you know that when a Sales Person leaves, all of their Client records have to be updated… one by one, by someone in Sales. It also caused problems when there wasn't a Sales Person available to assign to the clients right away.

Changing direction

This past January I had an opportunity to improve upon my original design. I simply changed directions. In the past, each Client had a Sales Person and each Sales Person had a Territory. Now, each Client has a Territory and each Territory has a Sales Person. If someone leaves, only ONE Territory record needs to be updated with a new Sales Person. If a new Sales Person is not available, then the Territory still shows up in the reports. This change was completely transparent to the report users.

I created a sample database model to show the relationships.

TSQLTuesday44 - DB Model

Data Model Created in Erwin
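In case the image doesn't render, here is a rough DDL sketch of the same relationships. The table and column names are simplified for illustration; the real model had many more columns.

```sql
CREATE TABLE dbo.SalesPerson
(
    SalesPersonID int NOT NULL PRIMARY KEY
);

CREATE TABLE dbo.Territory
(
    TerritoryID       int NOT NULL PRIMARY KEY,
    -- Self-reference supports the recursive hierarchy of Parent Territories.
    ParentTerritoryID int NULL REFERENCES dbo.Territory (TerritoryID),
    -- NULL means "no Sales Person assigned yet"; the Territory still reports.
    SalesPersonID     int NULL REFERENCES dbo.SalesPerson (SalesPersonID)
);

CREATE TABLE dbo.Client
(
    ClientID    int NOT NULL PRIMARY KEY,
    TerritoryID int NOT NULL REFERENCES dbo.Territory (TerritoryID)
);
```

With this shape, a departing Sales Person means updating only their Territory rows, and one Sales Person can own many Territories under different Parent Territories.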

The cherry on top

The best part came a month after the new model was implemented. The Sales Team needed to have a single Sales Person represent different Territories in different Parent Territories. That was not possible with the old model. A Sales Person could only have one Territory, but with the new model it was possible… and it was already in place.

Thanks for all the fish

Thanks go out to Bradley Ball for hosting this month's T-SQL Tuesday blog party. Please visit his website at http://www.sqlballs.com.

T-SQL Tuesday #43: Give Me a Key Lookup Operator for $1200, Please

My good friend Rob Farley (b|t) is hosting this month’s T-SQL Tuesday blog party. The party was started by Adam Machanic (b|t) in December of 2009.  The topic this month is on Plan Operators that are used in execution plans to tell us how the Optimizer is going to run our query statements.

What is a Key Lookup Operator?

The Optimizer uses indexes to retrieve the fields a query needs from a table. If the index being used is missing some of those fields, then the Optimizer has to go back to the Clustered Index to get them. This extra trip has to be made for every row the index operation returns, and it is noted in the execution plan by the Key Lookup Operator. When the table is a Heap rather than having a Clustered Index, the RID Lookup Operator appears instead of the Key Lookup Operator.

Show Me the Money

For my example I used the AdventureWorks2008R2 database. I ran the following query and looked at the execution plan.

SELECT
	c.CustomerID
   ,c.PersonID
   ,c.StoreID
   ,c.TerritoryID
   ,c.AccountNumber
   ,s.BusinessEntityID AS StoreID
   ,s.Name AS StoreName
   ,s.SalesPersonID
   ,st.Name AS TerritoryName
   ,st.CountryRegionCode
   ,st.[Group]
   ,st.SalesYTD
   ,st.SalesLastYear
   ,st.CostYTD
   ,st.CostLastYear
FROM
	Sales.Customer AS c
	JOIN Sales.Store AS s ON c.StoreID = s.BusinessEntityID
	JOIN Sales.SalesTerritory AS st ON c.TerritoryID = st.TerritoryID
WHERE
	c.StoreID IN (1284, 994, 1356, 1282, 992, 1358, 1280)

 

TSQLTuesday43 - KeyLookup

Two indexes were used to retrieve all the fields needed from the Sales.Customer table. The ix_Customer_StoreID index was missing the TerritoryID field so the Optimizer had to go to the PK_Customer_CustomerID Clustered Index to retrieve it. If you add the cost of both operators, then 66% of the cost of the query was used to retrieve fields from the Sales.Customer table.

For reference I removed the Clustered Index to show you what the execution plan would look like when a Heap is involved.

TSQLTuesday43 - RIDLookup

 

Since there was already a good index on the Customer table, I added TerritoryID to the INCLUDE part of the index script. This turned the index into a covering index: an index that contains all the fields from a table that are needed by a query statement. The INCLUDE part of an index allows extra fields to be part of the index without the overhead of keeping the data sorted. Any fields that are part of predicates or filters should be in the index key; all other fields the query needs from the table should go in the INCLUDE. Be cautious, though; don't throw in the whole kitchen sink. Those fields still take up space.
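The index script itself is in the screenshot; a reconstruction would look something like this (the index name matches the post, but the exact key column list is an assumption):

```sql
-- TerritoryID rides along in the leaf level via INCLUDE, so the query
-- no longer needs a Key Lookup back to the Clustered Index.
CREATE NONCLUSTERED INDEX ix_Customer_StoreID
    ON Sales.Customer (StoreID)
    INCLUDE (TerritoryID)
    WITH (DROP_EXISTING = ON);  -- rebuild the existing index in place
```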

TSQLTuesday43 - Index

When I ran the execution plan again, I saw that the Key Lookup Operator was removed and the total cost to retrieve the fields from the Customer table was now reduced to 12%. This would also be true if the table was using a Heap instead of a Clustered Index.

TSQLTuesday43 - CoveringIndex

The Bottom Line

When you see a Key Lookup Operator or a RID Lookup Operator, look to see if it makes sense to modify the corresponding index to be a covering index.

Thanks for all the fish

Thanks go out to Rob Farley for hosting this month’s T-SQL Tuesday blog party. Please visit Rob’s blog at http://sqlblog.com/blogs/rob_farley/default.aspx.

Shameless Plug for Grant Fritchey's (b|t) Book: SQL Server Execution Plans, Second Edition

Yup, this is a shameless plug for one of my FAVORITE books. Grant did an outstanding job on this book. He goes through the majority of the execution plan operators, discussing what they represent. He also goes into how best to read an execution plan and how changing your query can affect the operators that are shown in the execution plan. I highly recommend adding this book to your collection if you don't already have it.

This book is available from Red Gate. You can download a FREE eBook, buy a hard copy from Amazon, or, if you are lucky, pick up a copy at a Red Gate event.

 

Using Set Theory Instead of ISNULL To Filter Data

I have written dozens upon dozens of reports for my users. I love the challenge of finding the data and wrangling it into a report in a way that doesn’t take a hundred years to run when they decide to return a year’s worth of data. One of the common requests that I get is to provide a parameter that will allow them to choose a single client or all clients. At first blush this is a very simple request and can be accomplished by using the ISNULL function.

ISNULL - ISNULL Code

Unfortunately, there are performance implications to using the ISNULL function: it is non-sargable. When the function is used as part of a predicate, the index on the filtered column can't be used to seek directly to the matching rows. In this example, the execution plan shows that a full index scan was used to return one row, which equated to 80 Logical Reads for my dataset. It is the same number of Logical Reads whether one row is returned or all rows are returned. Eighty Logical Reads may not be a big deal, but what about on a table that has hundreds of thousands of rows?
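The query itself lives in the screenshot above; a minimal sketch of the pattern, with hypothetical table and column names, looks like this:

```sql
-- If @ClientID is NULL, return every client; otherwise return just that one.
-- Because the filtered column appears inside the ISNULL expression,
-- the predicate defeats an index seek on ClientID.
DECLARE @ClientID int = 7890;

SELECT ClientID, ClientName
FROM dbo.Client
WHERE ClientID = ISNULL(@ClientID, ClientID);
```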

ISNULL - ISNULL Execution Plan
Execution Plan

ISNULL - ISNULL Reads 1
Statistics on the Table I/O Tab in SQL Sentry Plan Explorer Pro

 

An Alternate Universe

There is an alternate way of accomplishing this same request and it uses Set Theory to solve the problem. I can accomplish the same request by separating out the ISNULL logic into two different SQL statements and using UNION ALL to join the two result sets into one result set.  Even though I’m using two different SQL statements, only one will return data during the execution of the logic.

ISNULL - Union All Code

The first SELECT statement returns rows when the @ClientID variable has a value other than NULL. The second SELECT statement returns rows when @ClientID is NULL. By using UNION ALL instead of UNION, we skip the work of checking for duplicate rows between the two result sets.
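Using the same hypothetical dbo.Client table as before, the rewrite in the screenshot follows this shape:

```sql
DECLARE @ClientID int = 7890;

-- Only one branch returns rows for any given @ClientID value.
SELECT ClientID, ClientName
FROM dbo.Client
WHERE ClientID = @ClientID      -- a single client was requested

UNION ALL

SELECT ClientID, ClientName
FROM dbo.Client
WHERE @ClientID IS NULL;        -- all clients were requested
```

The first branch is now a plain equality predicate, so the Optimizer can seek on an index over ClientID.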

The Execution Plan now contains the ability to handle both cases, and the Logical Reads changes based on the value of @ClientID. There are still 80 Logical Reads when @ClientID is NULL because the whole table is being returned, but the Logical Reads are reduced to two when @ClientID is set to a single client number.

 

ISNULL - Union All Execution Plan 1
Execution Plan when @ClientID =7890

ISNULL - Union All Reads 1

 

Statistics when @ClientID = 7890
ISNULL - Union All Execution Plan 2
Execution plan when @ClientID IS NULL

ISNULL - Union All Reads 2
Statistics when @ClientID IS NULL

 

In this example I was able to use UNION ALL to filter on one client or all clients, but it isn't always the best solution. If the table being filtered fits on one page, then a table scan will occur regardless of how the predicate is set up. In that case, the ISNULL statement is easier to maintain, so it would be the better solution.

Conclusion

It is always best to take a look at the Execution Plan of a SQL statement to see if there are any Index Scans on filtered data. If there are, you should take a look to see if there is another, set based approach to solve the problem.

T-SQL Tuesday #42 – My Journey on a Roller Coaster

Wendy Pastrick (b|t) is hosting this month’s T-SQL Tuesday blog party. The party was started by Adam Machanic (b|t) in December of 2009. The topic this month is on the Long and Winding Road.

Anticipation of the roller coaster

Today is T-SQL Tuesday, literally. I'm sitting in Mother's Market Cafe, drinking my coffee and succumbing to the fact that I should be part of this month's blog party. You see, I wasn't going to write this month. I didn't know what to say, but then I read Pat Wright's (b|t) T-SQL Tuesday post and I was inspired to write about my journey, because it has been a tough one.

Getting on the roller coaster

I went to Cal Poly, San Luis Obispo to study Mathematics in the early 90’s. The first thing I did was sign up for a computer programming class because I knew it was going to be important, but I had no idea how important. My third year there I realized that I didn’t like my major. (Too much theory.) I struggled with the decision to study computer science or statistics. I took what I thought was the easier path, Statistics with a concentration in computer science. Pro tip: It was not the easier path, but I don’t regret it.

The first dataset I worked with came from a 10-year study of lobsters. I found that I liked data. I also found that I absolutely loved the one database design class that was available. I graduated in 1994 without a job. Why? Because jobs were scarce in the field of statistics where I was moving, and Dice and Monster had yet to be written. I did, however, have an opportunity to teach… computer programming. Specifically, Microsoft Visual Basic, and I liked it.

The roller coaster goes up

As time went on I went from teaching to consulting. I liked programming, but I wasn’t in love with it. I did love writing SQL and I was good at it. I kept finding myself on the teams who wrote the business objects. Once in a while I would even get a say in the database design.

The roller coaster goes down

While I did enjoy writing business objects and SQL, I did not like the way I was treated by some of my colleagues and even a manager that I had. As my self-esteem was walked all over by these people, I got more and more dejected. After I had my second beautiful daughter I decided I was through with IT and that I would never return. (Ominous music played here.)

I went back to college to become a high school Mathematics teacher… and that didn't work out too well for me, so I went back to what I knew best: computer programming. With my self-esteem low, I took a position I was overqualified for. I met some great programmers at this new company, and some real big jerks whom we shall not talk about. After three years I left the company.

The roller coaster banks left

I had always wanted to pursue being an artist, and this seemed to be an opportunity to do so. I was able to work as a jewelry designer and a glass artist for three years before reality set back in: the private education my daughters were enjoying cost money, money that my art was not bringing in. Pro tip: This is not the profession to get into during a recession. I had to go back… and I didn't want to.

The roller coaster goes up

I refused to go back to being a VB.net programmer. What else was there? A friend of mine had a position at his company authoring reports. He needed someone to come in and convert old Crystal Reports to SSRS reports and to author new ones.  I had worked with both Crystal Reports and SSRS in the past so I accepted the job. The most amazing thing happened. I enjoyed my work and there wasn’t anyone tearing me down. In fact, there was someone on the team who took the time to rebuild my self-esteem and I am so happy he did.

The roller coaster goes faster

A year ago I realized that I wanted to learn more about writing better queries, data warehouse design, SSAS, and everything else about the SQL Server stack that I could absorb. I started attending a few conferences. I started looking for user groups and blogs to read. And the cherry on top was finding our SQL Family who encouraged me to start teaching again. This is the best ride ever.

Retrospective

How does this tie into technology? Well, it's more about how we tie into technology. Different personalities fit different professions. Twelve years of my career were spent working with the wrong technology. I shouldn't have pursued computer programming for all those years, but database programming. Unfortunately, back then you had to do the database administration as well as the database programming, and the administrator's life is not for me.

As I move forward with my career I am eager to learn more about the SQL Server stack and to build relationships within the SQL Server community. I also look forward to how the SQL Server stack continues to change to the meet the needs of the world.

Thanks for all the fish

Thanks go out to Wendy for hosting this month's T-SQL Tuesday blog party. Please visit her website at http://wendyverse.blogspot.com.
