- Details
- Written by: Glenda Gable
- Category: Triggers
Recently, a job was changed to stop at an earlier step than normal (presumably for testing). When it ran that night, it didn't run all the steps. This job has a lot of dependencies, which had to be re-run the next day, and that caused a bit of chaos for a few applications, data warehouse data, and reports.
In order to see who made the change, some sort of auditing would have to have been running before the change happened. Auditing of SQL jobs (and more) had already been identified as a project for our department, but it hasn't been prioritized yet. That, of course, doesn't help identify what happened this particular time. :)
In the meantime, management wants to be able to track these types of changes. Therefore, I started researching the options and found some common solutions:
- Details
- Written by: Glenda Gable
- Category: Powershell
An issue came up today where the vendor was sending a text file, but the text for some of the records spanned multiple lines rather than a single line. After researching, I found that I needed to compare each line of text and, based on certain criteria, append it to the prior line. I couldn't just strip certain end-of-line markers because they were mixed.
I am using a dummy test file in this example, but here is how everything played out:
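The post itself walks through this in PowerShell, but the current-line/previous-line idea can be sketched quickly in Python too. In this sketch the delimiter, field count, and continuation rule are made-up assumptions for illustration, not the vendor's actual format:

```python
def merge_continuation_lines(lines, field_count, delimiter="|"):
    """Merge lines that are really continuations of the previous record.

    A line is treated as a continuation when it has fewer delimited
    fields than a complete record should (a made-up rule for this
    sketch; the real criteria depend on the vendor file).
    """
    merged = []
    for line in lines:
        line = line.rstrip("\r\n")
        # A complete record has `field_count` fields; anything shorter
        # is text that wrapped onto a new line mid-record.
        if merged and line.count(delimiter) < field_count - 1:
            merged[-1] += " " + line  # append to the prior line
        else:
            merged.append(line)
    return merged

# Example: record 2's text wrapped across two physical lines.
raw = ["1|widget|red", "2|gadget|blue and", "green", "3|gizmo|white"]
print(merge_continuation_lines(raw, field_count=3))
# → ['1|widget|red', '2|gadget|blue and green', '3|gizmo|white']
```

The key point is the same as in the PowerShell version: you always keep a handle on the previously emitted record so the current line can be appended to it.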
Read more: Working with current and previous rows in text files
- Details
- Written by: Glenda Gable
- Category: Configuration/ Automation
End-of-line characters include CR and LF. Windows uses both (CRLF) at the end of a line, whereas Unix uses only LF.
- CR = Carriage Return
- LF = Line Feed
Recently, while troubleshooting why data wouldn't import successfully as part of an automated process, I was pulling a subset of data out of the main text file, but the end-of-line markers weren't correct. I had copied several lines using Notepad++, and it automatically used CRLF markers. The automated process expected the end-of-line markers to be LF so the SSIS package could read the file properly.
This article will help jog my memory when I run across this again... but hopefully it helps someone else too! :)
First off, to see the end-of-line markers within Notepad++, you need to indicate that you want to see them. Click View > Show Symbol > then either Show End of Line, or Show All Characters if you also want to see spaces and tabs (sometimes the second option is easier).
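Notepad++ works well for one-off fixes, but when the conversion needs to happen as part of the automated process itself, a small script can do it. Here is a minimal sketch in Python (the file names in the commented-out part are hypothetical):

```python
def crlf_to_lf(data: bytes) -> bytes:
    """Replace Windows CRLF line endings with Unix LF."""
    return data.replace(b"\r\n", b"\n")

# Reading and writing in binary mode keeps Python's own newline
# translation from getting in the way:
# with open("extract.txt", "rb") as f:
#     converted = crlf_to_lf(f.read())
# with open("extract_lf.txt", "wb") as f:
#     f.write(converted)

print(crlf_to_lf(b"line1\r\nline2\r\n"))  # → b'line1\nline2\n'
```

Working on bytes rather than text is deliberate: it leaves lines that already end in a bare LF untouched, which matters when the markers are mixed.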
Read more: Using Notepad++ to change end of line characters (CRLF to LF)
- Details
- Written by: Glenda Gable
- Category: DBA Info
Jeffrey Verheul (blog | twitter) is hosting TSQL Tuesday. He thought of a great topic: SQL Family and community.
I first heard about the SQL community by way of the SQL Saturday in Baton Rouge (Aug 2012). I hadn't heard of anything like that before. I didn't have a Twitter account and I hadn't been in a true DBA role, so when I wanted to learn more, I just thought of books. Once I went to SQL Sat BR, it was like a whole new world opened up for me. I loved how the leaders interacted with each other, joking and kidding around - and honestly - I wanted that! I was a little too much in awe of everything that day to approach anyone, but I left with some awesome tips I had learned and couldn't wait for more.
I learned about their user group, and even though I was an hour away, I wanted to attend. Life got in the way, so I didn't attend many in that next year. However, when 2013 rolled around, I decided to throw my hat in the ring and submitted to be a speaker. I had some previous experience with public speaking, so I was only nervous about the technical content portion. They chose me, and after I presented, I realized I was completely addicted! I was thrilled to meet people I had learned from the previous year. It was an amazing experience all around.
I have since spoken at the user group in Baton Rouge, at this year's SQL Saturday in Baton Rouge (can't get enough of them), at 2 virtual chapters, and at the SQL Saturday in Dallas (Nov 2013). Each time I have had the privilege of meeting more and more people; each time I get a little awe-struck, and each time I feel relaxed and at ease when actually talking with them.
I have since started a user group here in Lafayette, LA (although I still travel to BR for meetings too). One of the main reasons I started the group was to have people closer to home that I could connect with in the same way I do with others throughout the community. It is amazing to know that even though I have a lot I need to learn, I haven't met anyone who judges me negatively for that. In fact, it is quite the opposite. I don't know of any other IT community that responds to others in such an overall friendly way, and I feel privileged that I get to be a part of it.
P.S. - I mistakenly put someone else's name as the host for this month - and it turned into yet another example of how great this community is. I hadn't talked with him before, but I guarantee you when I meet him I will remember this and, based on his messages, we will both chuckle in a nice way. :)
- Details
- Written by: Glenda Gable
- Category: Tools
First off, this script is still a work in progress. It is something that I created while working with data warehouse indexes and added to when I was getting ready for my presentation. It helps me specifically in the ETL process - I do NOT use this for overall index maintenance.
What is nice about this script is that I am able to either disable or rebuild by using the same stored procedure. That way, when I want to update the process, I have only one place to change it, and all SSIS packages that use it will be affected.
The process I am using it in is as follows:
- Disable the clustered index (this also disables all nonclustered indexes and saves me from having to disable each one manually)
- Rebuild the clustered index
- Data manipulation (insert, update, delete, etc.)
- Rebuild all of the nonclustered indexes
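The stored procedure itself isn't shown here, but the statement sequence behind those steps looks roughly like this. The sketch below just builds the T-SQL strings in Python for illustration - the table and index names are hypothetical, and the real procedure adds the parameter handling that lets one call do either disable or rebuild:

```python
def etl_index_statements(table, clustered_index):
    """Return the ALTER INDEX statements for the ETL sequence, in order.

    Disabling the clustered index takes the nonclustered indexes out of
    play too; rebuilding it re-enables just the clustered index, so the
    data manipulation runs without nonclustered index overhead, and
    ALTER INDEX ALL rebuilds everything at the end.
    """
    return [
        f"ALTER INDEX [{clustered_index}] ON [{table}] DISABLE;",  # step 1
        f"ALTER INDEX [{clustered_index}] ON [{table}] REBUILD;",  # step 2
        # step 3: the insert/update/delete work happens between these
        f"ALTER INDEX ALL ON [{table}] REBUILD;",                  # step 4
    ]

for stmt in etl_index_statements("FactSales", "CIX_FactSales"):
    print(stmt)
```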
I am working on a full indexing strategy for the entire ETL process, but for now I have specific needs, and this takes care of all the problem children (tables), allowing the ETL process to continue largely unaffected while these specific steps run much faster than they did without it.
By accessing or using this code, you agree to be bound by the following: license
If you have feedback, or are interested in contributing, please email:
- Details
- Written by: Glenda Gable
- Category: SSIS
This is going to be a post with more pictures than text, but it is something that is useful. :) I did not set this package up; it was done before I started working here. But it has been asked about, so I am blogging about it.
When I release changes to production, I always want a rollback plan (I would assume most people do). In order to do that for SSIS, I use a package to export all of the SSIS packages to files and then save those in a zip folder. That way, I can get to them quickly as needed. Here is the package info for those who can also use it. I will say that this only looks at one of the folders within the msdb structure - the main BI ETL folder (which has sub-folders with the SSIS packages in them). The way this package works, I would have to include it in each of the SSIS solutions I use. I want to redo this to make it more generic and reusable for multiple solutions.
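For a more generic route than a per-solution package, one option (an assumption on my part, not what the existing package does) is Microsoft's dtutil command-line utility, which can copy a package stored in msdb out to a .dtsx file. Here is a sketch in Python that builds those commands for a list of packages; the folder and package names are hypothetical, and the flags should be double-checked against the dtutil documentation for your version:

```python
def dtutil_export_command(package_path, out_file):
    """Build a dtutil command that copies one msdb package to a .dtsx file.

    /SQL names the package stored in msdb; /COPY FILE;<path> writes it
    out to disk (verify the flags against your dtutil version's docs).
    """
    return f'dtutil /SQL "{package_path}" /COPY FILE;"{out_file}"'

# Hypothetical packages under a main folder with sub-folders.
packages = [r"\BI_ETL\Dim\LoadCustomers", r"\BI_ETL\Fact\LoadSales"]
for pkg in packages:
    name = pkg.rsplit("\\", 1)[1]
    print(dtutil_export_command(pkg, rf"C:\ssis_backup\{name}.dtsx"))
```

Driving this from a list of folders (or a query against msdb) is what would make the export reusable across solutions instead of baked into each one.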
- Details
- Written by: Glenda Gable
- Category: Configuration/ Automation
This month's topic is Automating Tasks, and it is presented by SQLChow (blog | twitter).
Automation - it's a blessing and a curse. The curse part is that you want to do it, but who has the time? Of course, the blessing is that it will save you time in the long run, but you have to put in the time up front in order to gain that extra time. :)
I use SSIS daily for my job. I work with the ETL, so automating the data warehouse steps comes naturally. Each time I do something for someone, I try to ask whether it is a one-time thing or something that will be repeated. I would rather automate it if I can.
Although I am sure this T-SQL Tuesday topic is about administrative types of processes that are automated, I think the best thing we have automated at my job recently is a set of "checks" that run at a certain time to make sure everything is as it should be. We are only checking some things, but it has led me to do more. Right now, the checks make sure there is data in certain tables where it should be, and they check certain data in the warehouse to make sure it is within a 5-10% range of yesterday's totals. These automated checks have really saved our bacon a couple of times. Before they were automated, we had to look at a report daily to make sure the data looked okay.
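The threshold part of that check is simple to sketch. A minimal version in Python - the totals and the 10% tolerance are example values, and the real checks pull the totals from the warehouse:

```python
def within_tolerance(today, yesterday, tolerance=0.10):
    """Return True when today's total is within `tolerance` of yesterday's.

    A sketch of the kind of sanity check described above; the 10%
    default matches the upper end of the 5-10% range mentioned.
    """
    if yesterday == 0:
        return today == 0  # avoid dividing by zero; flag unexpected data
    return abs(today - yesterday) / abs(yesterday) <= tolerance

print(within_tolerance(104_000, 100_000))  # 4% change → True
print(within_tolerance(88_000, 100_000))   # 12% drop → False
```

In practice a failed check just sends an alert so someone looks at the data before the reports go out.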
So, back to the main topic. I have recently been learning PowerShell. It is a great tool for automating tasks across multiple servers in a single script. My latest automation is to update the DEV database from a copy of the PROD backup file. I get the latest backup filename and location, move the file from the production server to the development server, restore it with different final data file locations, and finally update users and permissions. It was my first PowerShell script, and thank you to Sean @ MidnightDBA for the help!
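The restore step in that sequence comes down to a RESTORE DATABASE ... WITH MOVE statement. The actual script is PowerShell; this is just a sketch of the statement construction in Python, with hypothetical database, logical file, and path names:

```python
def restore_with_move_sql(db, backup_file, moves):
    """Build the RESTORE DATABASE statement for the DEV refresh step.

    `moves` maps each logical file name in the backup to its new
    physical path on the development server (the "different final data
    file locations"). REPLACE overwrites the existing DEV database.
    """
    move_clauses = ", ".join(
        f"MOVE N'{logical}' TO N'{physical}'"
        for logical, physical in moves.items()
    )
    return (
        f"RESTORE DATABASE [{db}] FROM DISK = N'{backup_file}' "
        f"WITH {move_clauses}, REPLACE;"
    )

print(restore_with_move_sql(
    "SalesDW",
    r"D:\restores\SalesDW_prod.bak",
    {"SalesDW_Data": r"E:\data\SalesDW.mdf",
     "SalesDW_Log": r"F:\log\SalesDW.ldf"},
))
```

The MOVE clauses are what let the same PROD backup land on a DEV server with a different drive layout.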
I know that this year (2014) will be spent learning more about database administration, and more automation scripts will come to be. It is exciting to know that I will feel more in control of the overall SQL environment once I have automated more of it.