Sunday, November 18, 2018

TSQL Tuesday #108

This month's T-SQL Tuesday is being hosted by Malathi Mahadevan (blog), with the topic Non SQL Server Technologies. Mala wants you to share what you are learning that isn't SQL Server, but that helps make us more well-rounded as the data world expands.

I don't know if this qualifies exactly, but I have been learning more about servers and hardware. I don't want to be a server architect or administrator, but I can see how understanding the hardware helps me do an even better job of knowing how SQL Server responds.

It all started by using the monitoring tool Spotlight for SQL Server Enterprise from Quest. While there are multiple great tools out there, to me this one has a clean interface and is very helpful for seeing how the moving parts work together. I started wanting to understand all of the metrics that the Quest team thought were important enough to show visually. Some of them I was familiar with, such as sessions, blocking, CPU usage, and total memory, but there were others I didn't know about. Metrics such as buffer cache, page life expectancy, and I/O speeds were not something I had been exposed to.
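As an example of what I mean, page life expectancy turned out to be something I could check right from SQL Server with a simple query like this (a minimal sketch; the column alias is just my own):

    -- current page life expectancy, in seconds, from the buffer manager counters
    SELECT [object_name], counter_name, cntr_value AS page_life_expectancy_seconds
    FROM sys.dm_os_performance_counters
    WHERE counter_name LIKE 'Page life expectancy%'
      AND [object_name] LIKE '%Buffer Manager%';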

Add to all of this, I had a co-worker at my last job who loved helping others learn and was happy to answer any question you threw at him. Saying he was the server guy doesn't do him justice, yet he still took the time to explain things as they came up. He opened my eyes to how and why memory coming off the northbridge is faster than the sticks of memory plugged into the motherboard, and to how and why the disk and RAID configuration matters, along with many other things.

I can't say I know a lot of the hardware "stuff"; it is a continual learning process for me. But it is something I believe helps me do a better job than I did several years ago. Also, even though a virtual environment doesn't see the underlying hardware in the same way, the concepts are still the same. My next foray will be learning about the cloud - but that is a topic for another day. :)


Check AD members and groups

All of the companies I have worked for have had individual AD accounts with permissions in SQL Server. They have all wanted to keep things simpler by using AD security groups, but there are various reasons why that isn't always the reality, such as software limitations, linked server configuration, legacy needs, changes that impact critical processes, etc.

I was recently introduced to the xp_logininfo stored procedure. It is a very handy tool for gathering information about an AD account or an AD security group. Whenever someone asks for access, it is my go-to for finding all of the groups they belong to that already have some permissions. I still check AD to see if there are more appropriate groups (based on who else is in a group), but I thought I would share how this stored procedure is helpful.

  • xp_logininfo 'ADaccountName', 'all'
    By including the AD account name, you can get a list of all of the AD security groups that user belongs to which have permissions on the particular instance being queried. That last part is important. It isn't a list of all of the network groups the user belongs to - it is only those that already exist on that instance of SQL Server. For example, if my account belongs to GroupA, GroupB, and GroupC, and the instance I am querying only has GroupB with permissions, then GroupB will be the only result returned even though my account belongs to all three groups in AD.
  • xp_logininfo 'ADsecurityGroupName', 'members'
    By including the AD security group name, you can get a list of all of the AD members who belong to that group. This is an easy way to query for a list of everyone who has access. (Both calls are sketched just after this list.)
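As a quick sketch of both calls (the account and group names here are made-up placeholders):

    -- groups on this instance that the hypothetical account DOMAIN\jdoe belongs to
    EXEC xp_logininfo 'DOMAIN\jdoe', 'all';

    -- members of the hypothetical group DOMAIN\SQL_ReadOnly
    EXEC xp_logininfo 'DOMAIN\SQL_ReadOnly', 'members';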

 


TSQL Tuesday #105

This month's T-SQL Tuesday is being hosted by Wayne Sheffield (blog), with the topic Brick Wall. He wants you to share about a time when you ran up against your own brick wall, and how you worked through it or dealt with it.

I find that, for me, most of the technical reasons for hitting a wall in a project have been solved by asking others for help or researching on my own. I haven't come across a wall that I haven't been able to climb or go around - so far (I am knocking on wood as we speak).

My professional and personal goals, however, are a different story. At the start of this year, Doug Purnell (blog) and I became accountability partners. I wrote a blog post about it, which you can read here. It was actually due to the Dec 2017 #tsql2sday blog chain! :)

Things were going well for me; I landed a new job which got me out of a bad environment, and I was settling in okay. Unfortunately, in early June I felt like I hit a wall. I missed one of our bi-weekly meetings due to an unexpected family matter, and to me it felt like a big deal. In that call, we had scheduled time with Grant Fritchey to answer some questions and pick his brain. I started an email to Grant to apologize for missing that meeting and to ask to reschedule, but I haven't sent it yet. I feel bad about missing it, and this is where the wall comes in: I am beating my head against it, waiting to bust through before I send the email, which is silly because I know it doesn't work that way.

If it weren't for Doug and our continued meetings and communications, I don't know if I would be on track right now. If I am just setting goals and no one else knows about them, it's easy to slide for a month with no momentum. But since I have someone else to keep me going, it is harder to not move at all. Doug recently told me he was re-evaluating the list of goals we originally set for ourselves at the beginning of the year, and that made me feel like I had permission to do the same. It's silly, but that really helped and made me feel relieved - it released a lot of the pressure I was putting on myself.

In summary, I think I am still at the wall today, but I am on track to finding my way around it, thanks to Doug's help. I think we set ourselves up for success by finding each other when things were going well, so that we had a system in place for when things got messier.

I am very grateful for this topic this month... it really made me stop and think. I am going to send the email to Grant today so that I can stop beating myself up for not doing it yet.


TSQL Tuesday #104

This month's T-SQL Tuesday is being hosted by Bert Wagner (blog | twitter), with the topic Code You Would Hate To Live Without. He wants you to share about the code you would hate to live without, whether yours or someone else's.

I know there will be a lot of wonderful posts about this topic! I have two things that stand out as scripts I love.

  • I changed a stored procedure which used a cursor and ran for 45 minutes into a set-based query that runs in less than 5 minutes (the general pattern is sketched after this list). While I know I am stretching the topic just a little, it is something I think about often, very proudly. This was something that helped cement the idea that I was on the right path in my career. I knew I loved working with data and databases, but I remember the level of excitement I had when I got it to work. :)

 

  • The other script is something I put together which works with log shipping for multiple databases. I made a robust, automated process from start to finish. There are still improvements I would make to it, but I don't work for that company anymore, so I probably won't have a chance to make them in production for a while. The vendor had multiple databases (approximately 45), and that number would continue to grow as we added more businesses. Therefore, the solution had to be dynamic, nimble, and easy to maintain. I am proud of the documentation (ohhh... I said a bad word), workflows, and full process with error handling that I set up. It was also easy to troubleshoot, and to run just one or a few databases at a time if only some were lagging behind.
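The original cursor procedure is long gone, but as a hedged illustration of the general cursor-to-set-based pattern (the table and column names here are hypothetical):

    -- hypothetical cursor version: one UPDATE per row
    DECLARE @OrderID int;
    DECLARE order_cur CURSOR FOR
        SELECT OrderID FROM dbo.Orders WHERE Status = 'Pending';
    OPEN order_cur;
    FETCH NEXT FROM order_cur INTO @OrderID;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        UPDATE dbo.Orders SET Status = 'Processed' WHERE OrderID = @OrderID;
        FETCH NEXT FROM order_cur INTO @OrderID;
    END
    CLOSE order_cur;
    DEALLOCATE order_cur;

    -- equivalent set-based version: one UPDATE for the whole set
    UPDATE dbo.Orders SET Status = 'Processed' WHERE Status = 'Pending';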

While neither is a general-purpose script (they are both specific to the company I worked for at the time), they both make me feel confident in my abilities as a DBA. I know that I can keep at something to make it eventually bend to my will (muwhahha), and enjoy myself along the way.


Learning new stuff - quickly and accurately

I think one of the most helpful abilities for being an exceptional DBA is being able to learn something new fairly quickly, yet accurately.

I recently received a request to add an assembly to a SQL Server instance. I knew that functionality was there - it allows more flexibility within SQL Server - but I hadn't had any dealings with it yet. I was provided a script that had the hex value of the assembly and code to set the permission set to EXTERNAL_ACCESS.

It's going to sound weird, but I am glad that script threw an error, because I have learned (and am still researching) much more about this. There are a lot of posts that provide code for setting the TRUSTWORTHY property of the database. Some of those posts point out that this isn't the best method, and if you keep looking, there are a few that go into depth as to why.

In this case, I am having to work with the App Dev department to get the assembly signed. Since I like being able to feel confident in my responses to the developers, I need to understand more about what is going on than just the quick and easy answer. All of this research has taken up part of my day today, and the testing will probably consume most of tomorrow.
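For my own notes, this is the general shape of the signing approach instead of flipping TRUSTWORTHY on; the file path, key, login, database, and assembly names below are all hypothetical placeholders:

    USE master;

    -- create an asymmetric key from the signed DLL (hypothetical path)
    CREATE ASYMMETRIC KEY SqlAssemblyKey
        FROM EXECUTABLE FILE = 'C:\Assemblies\MyClrAssembly.dll';

    -- create a login from that key and grant it the external access right
    CREATE LOGIN SqlAssemblyLogin FROM ASYMMETRIC KEY SqlAssemblyKey;
    GRANT EXTERNAL ACCESS ASSEMBLY TO SqlAssemblyLogin;

    -- now the assembly can be created with EXTERNAL_ACCESS in the user
    -- database without setting TRUSTWORTHY ON
    USE MyUserDatabase;
    CREATE ASSEMBLY MyClrAssembly
        FROM 'C:\Assemblies\MyClrAssembly.dll'
        WITH PERMISSION_SET = EXTERNAL_ACCESS;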

I point this out because earlier in my career, I would have stopped at the easy answer (which has the most posts). Now, though, I look for the fewer posts that explain the why, in depth, so that I learn everything about what is going on instead of just doing something because it's easy. I feel proud of my progression, and sometimes we need to recognize little wins like this. :)

This is just a quick post while SQL Server is installing. I am spinning up an instance in order to play around with installing an assembly, the different permission sets involved, and what is appropriate in the different environments (DEV and PROD), since the assembly already exists in one and not in the other. Back to the fun!


KILL spid (WITH STATUSONLY)

This post covers how to stop, or terminate, a process that is running. Stopping a process shouldn't be the first step when troubleshooting, as it can make the situation worse. However, it can sometimes be helpful in clearing blocking chains or other performance issues.

In order to know what to stop, you will need to know what processes or queries are running. There are a variety of ways to see everything that is running right now, including the following:

  • Microsoft built-in tools - sp_who and sp_who2
  • Adam Machanic's tool - Who is Active (click here to download it)
  • A handwritten query using the sys.dm_exec_requests and/or sys.dm_exec_sessions DMVs (a sketch follows this list)
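For the handwritten option, this is roughly the kind of query I mean (the exact column list is just an example of what I find useful):

    -- currently executing user requests, who is running them, and what they are waiting on
    SELECT  r.session_id,
            r.status,
            r.command,
            r.blocking_session_id,
            r.wait_type,
            r.wait_time,
            s.login_name,
            s.host_name,
            t.[text] AS current_sql
    FROM sys.dm_exec_requests AS r
    JOIN sys.dm_exec_sessions AS s
        ON s.session_id = r.session_id
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    WHERE s.is_user_process = 1;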

 

KILL

Once you know the session ID (spid) of the query or process you want to stop, you can do so using the KILL command. The command is pretty straightforward; the syntax is simply KILL spid (replace "spid" with the session ID of the process you want stopped).
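For example, using a hypothetical session ID:

    -- terminate the hypothetical session 87 found via sp_who2 or the DMVs above
    KILL 87;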

A word of caution here: there are times when it is best not to kill a session. For example, if a process has been running for a while using multiple threads, the rollback after it is killed will only be single-threaded. Therefore, the rollback could take much longer than letting the process finish on its own. All of the same performance issues and/or blocking will continue during the rollback, so killing the session could actually make things worse, not better.

Some processes are not able to be stopped: per Microsoft, "System processes and processes running an extended stored procedure cannot be terminated."

 

UNIT OF WORK ID

The KILL command can also be used to stop or terminate orphaned distributed transactions. The GUID value that is the Unit of Work ID (UOW ID) can be found in the error log, the MS DTC monitor, or the sys.dm_tran_locks DMV.

The syntax is simply: KILL uowid
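A sketch of finding and then killing one (the GUID below is a made-up placeholder):

    -- orphaned DTC transactions show up with a session_id of -2
    SELECT DISTINCT request_owner_guid
    FROM sys.dm_tran_locks
    WHERE request_session_id = -2;

    -- kill by Unit of Work ID (hypothetical GUID)
    KILL 'B0BAFB52-77A2-4F62-9F86-B5C53C3B7D3B';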

 

WITH STATUSONLY

While I knew about the KILL command, until semi-recently I wasn't aware that you can check the status of a rollback with the WITH STATUSONLY option.

The syntax is simply KILL spid WITH STATUSONLY - or - KILL uowid WITH STATUSONLY.

For processes that are in the middle of being rolled back, this will show the status of the rollback. This option doesn't "do" anything; it just shows the progress. Keep in mind that, as with other Microsoft-provided durations, the estimate isn't exact. It is more of a ballpark figure that might be close, but may also be a bit off from the actual time.
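Again with a hypothetical session ID:

    -- check the progress of a rollback that is already in flight; this does not start a new kill
    KILL 87 WITH STATUSONLY;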

If you run this option on a spid that isn't part of a rollback, it won't hurt anything. You will just see an error returned indicating a rollback isn't in progress.

 

 

It's funny, but whenever I have to kill a process (which is thankfully rare), I can't help but do a bad-guy cackle in my head... lol!

 


My Goals for 2018

When Doug Purnell (blog) and I became accountability partners, we had to define our goals and share them in order to keep each other accountable. I wrote out my goals, but hadn't shared them until now (it's March). One of my goals was to find a new job, and when I submitted one application in January, I didn't expect it to be the job I accepted (or that I would then move out of state and start the new job, all within one month)!

Now that things have calmed down, I am publishing my goals to everyone.  I will update this main post with links to the other blog posts I create as I complete these goals.


