SQL313
   


  • TSQL Tuesday #59

    This month's T-SQL Tuesday is being hosted by RealSQLGuy - Tracy McKibben (blog | twitter) and he is calling out for all Heroes!

    What a fabulous topic! I really had to think about it, though. I think I have one personal hero and two professional heroes. I should give a disclaimer here - I have a hard time picking a favorite anything, so this just follows suit I guess. :)

    First, my professional heroes (not listed in any specific order):

    • Stuart - I used to work with Stuart at my previous employer. I didn't think of him in this regard at the time, but looking back I do. He was the DBA, and I was more of a programmer/database developer. Every time I had a question, he would explain why, and I would walk away feeling smarter, not dumber. I have worked with people who made you feel like an idiot when you asked a question, and I never had that from Stuart. In fact, he would explain in depth if I asked. For instance, he taught me tricks on query optimization, rather than just doing it and giving it back to me with no explanation. I remember a specific moment at my current job: it was the first time I changed a 45-min cursor within our ETL process to a 5-10 min set-based query. I was ecstatic! I don't think I would have thought of it in that way without working with Stuart before. In all honesty, when I think of a DBA, he is what I think about. I work with the BI side of things now, so it is different than my work then, but I wouldn't have the confidence I do now without that experience.
    • Chris - I used to work with Chris at my previous employer, too. He is the type of person that is very smart, down-to-earth and nice. He also speaks code fluently as another language (or he at least seems to). :) He is my hero because if it weren't for him, I would probably not have realized that I am not a programmer. Sure - I can do it. But, watching him understand and code so quickly and easily, I realized that is not my natural-born skill. For me, I have to look up syntax and really dig into a project because it didn't come as easily for me. The hero part comes from him being such a great guy about it. He also took the time to help you without making you feel stupid in any way. How many people do you know who are so good at what they do but don't have the ego behind it? I am the type that feels heavily responsible for what I am doing, which sometimes means I keep grinding at a project (or several) until I feel like it is perfect - and it never is. When I realized I am not a programmer, it made me think of what I can do that easily - and that is what led me to working with data much more.

    I have saved the best for last - my personal hero:

    • My Mother - She is kind and sweet and everyone loves her! She is what I want to be (when I grow up - haha). I have had some very challenging years and I don't know how I would have gotten through them without her. I am a positive person, but she helps me think in that way when I really don't want to. Maybe I have someone at work who doesn't like me and it shows - I take things personally, and next thing you know I am not thinking happy thoughts, and am much more hurt. She helps me get back in the right mindset and not to worry about it. She tells me I need to let it slide like water slides off a duck's back. I will admit, I hate hearing her say that - but she is right and I know it. :) I also like knowing she relies on me as much as I do her - so it's not all one-sided. :) I am lucky enough to be able to say my mother is my best friend and is definitely a super-hero!
  • TSQL Tuesday #98

    This month's T-SQL Tuesday is being hosted by Arun Sirpal (blog) - Your Technical Challenges Conquered. I love how Arun wants to jump in at the start of the new year and talk about something we have conquered... it really sets the tone for the year!!

    While this isn't the most challenging thing I have ever done, it was (and still is) something I am proud of.  For my post, I am writing about a scenario that stretched my boundaries on planning things out well in order to come up with a robust solution.

    My company provides visits to patients in their home, hospice care, or long-term care in an acute care setting.  We have multiple facilities throughout the USA.  The vendor of one of our EMR systems creates a separate database for each facility.  Therefore, the vendor currently has 40+ databases for our company.  For reporting and analysis needs, we needed to bring a copy of the data into our data center.  Of course, when the business asks for something, they want it right away.  The easiest and quickest way was to set up log shipping.  Discovery led us to realize that the end result would be used by the ETL team for importing into the data warehouse, the reporting team for ad-hoc reporting, and the application development team for various custom-built in-house applications.  Therefore, on top of the log shipping, I also needed to make things easy for them to pull the data in a consistent way, without them having to keep up with new agencies, change connection strings, etc.

    When thinking about the project as a whole, I knew I had the following requirements:

    • agencies would be added as we transition them from another software to this EMR
    • agencies would be added as we acquire new companies
    • the vendor's folder structure held the backup files in a root folder, with its own sub-folder for every database
    • handle issues that arise for one or multiple databases without interfering with the other database restore processes
    • report on the "up-to-date" date for each database when users ask (there will always be someone who doubts the data is up to date because they don't see something they expected)
    • combine/aggregate all data into a single database so that data can be queried in a consistent manner
    • encryption process needs to be included in order to have the files encrypted in transit (even though using SFTP) and at rest

    After planning, then polishing, then throwing that away, then planning some more, then polishing some more, etc., my end result is something that flows fairly well.  I have a table which lists all of the databases, the folder where the log backups are, a flag to indicate "PerformLogBackups" and a flag to indicate "AggregateData".  I have a decent-sized script which queries a list of all of the databases which need to be updated, copies the encrypted file off the SFTP server onto a file share (just in case...), and then moves the file to the SQL Server local drive.  Then, it decrypts all of the files, loops through each database, and restores each file in order.  Once all of the restores are completed, another job runs which aggregates the data into a single database for reporting needs.  Lastly, it cleans up the files and sends appropriate notifications.

    The table which holds the list of databases solves the requirement of adding new agencies easily.  It also handles the scenario where issues arise with some, but not all, databases - by toggling the flags appropriately.  If a certain database is having an issue, toggling the flag will stop it from being restored, but the others will continue as normal.  Easy peasy... lemon squeezy...
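    The control table and its toggles can be sketched roughly like this in T-SQL. This is a minimal illustration, not the actual schema - aside from the PerformLogBackups and AggregateData flags named above, the table, column, and database names are hypothetical stand-ins.

```sql
-- Hypothetical sketch of the control table driving the restore process.
CREATE TABLE dbo.AgencyDatabases (
    DatabaseName       sysname       NOT NULL PRIMARY KEY,
    LogBackupFolder    nvarchar(260) NOT NULL,  -- vendor sub-folder for this database
    PerformLogBackups  bit           NOT NULL DEFAULT 1,
    AggregateData      bit           NOT NULL DEFAULT 1
);

-- Pause restores for one problem database without touching the rest:
UPDATE dbo.AgencyDatabases
SET    PerformLogBackups = 0
WHERE  DatabaseName = N'Agency042';

-- The restore script only picks up databases that are flagged on:
SELECT DatabaseName, LogBackupFolder
FROM   dbo.AgencyDatabases
WHERE  PerformLogBackups = 1;
```

    Because the restore loop is driven entirely by this query, a new agency is just a new row, and a broken agency is just a flipped bit.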

    I have checkpoints in the process, which will be covered in an upcoming post and presentation.  Also, I still need to do more to make it even more well-rounded and hardy, but that is another reason I am writing about it today - it is still ongoing!  I love the idea of having something that works but that I can tweak and polish to make it better and better.  After all... is anything ever really done? :)

  • TSQL Tuesday #42

    It is time for TSQL Tuesday - Wendy Pastrick (blog | twitter) is hosting this month and the topic is "The Long and Winding Road". Wow, this is a great topic - I can't wait to read many of the other posts over the course of this week! :) This is my first time writing for TSQL Tuesdays, and I can't think of a better way to start.

    To start off with, computers have been a part of my life since I was "knee high to a grasshopper". I remember playing on the Atari and the Commodore. That was back in the days when you put the cartridge for Frogger in, went to have dinner, and then by the time you finished, it was loaded and you could play. I also remember when the game BOMB SQUAD came out and it TALKED!! Holy smokes, it was totally AWESOME at the time! :) (I don't do a good impression of those words, although I try, but they are definitely etched into my brain for all time.) I remember thinking, at that young of an age, how smart and creative you would have to be to come up with stuff like this. I dabbled in BASIC for a bit, and I am proud to say I wrote my first code at a tender young age - I could print "Hello Dad, this is cool!" on the screen and make it repeat X number of times.

    Skip ahead a few years (ok, many… but who is counting?) - I was an administrative assistant at an HMO in Florida. I always had a knack for picking up on how software worked by playing around, and as an Admin Asst, it came in handy to be able to help others. I got in good with the IT guys - I mean, after all, you should always make friends with IT. That is the first time I was introduced to databases. I worked in the Pharmacy department, and we had a vendor that tracked the utilization of medications for our members that we needed to better analyze. So, I used the tools I knew - Microsoft Office - and came up with a fabulous solution. I imported the csv file they sent monthly into an Access database (please don't cringe, the story gets better - I promise :)). Then I scrubbed and manipulated the data, and finally spit it out to Excel. Excel took over, I ran a few macros, and voila - the end result was a formatted report that was used as-is. Everything was automated and I LOVED it!

    I should have taken the hint even back then, but it took several years and a few jobs in between for me to figure out my passion - data and database management. Right now, I work with SQL 2008 R2, but I have worked with a number of database platforms (MS Access, FoxPro, Visual FoxPro, MySQL, SQL Server 2000/2005/2008/2008 R2, and now delving into 2012). I have loved the intricacies of each, but I like working with SQL Server most. I am in a 2.85-person BI team, and while I think what they do on the analysis side is cool, nothing compares to digging my hands into the database directly. I like putting accurate data into the hands of my colleagues and letting them do whatever they wish with it. I also love SSIS! I don't know what I would do without it at the moment.

    While SQL Server is what is taking me to new places right now, I have to be honest. I cherish the memory of when I made my first integrated BI app using an Access database and Excel macros. I don’t know where I would be without that experience. I truly hope to have that kind of moment in SQL Server. I am studying to get my certifications in SQL Server 2012. Hopefully along the way I will have a chance - with such interesting characters in this industry, it is probably almost guaranteed! :)

  • TSQL Tuesday #45

    It is time for another TSQL Tuesday - Mickey Stuewe (blog | twitter) is hosting this month and the topic is "Invitation–Follow the Yellow Brick Road". Well, here it goes...

    I worked for a medical device company that needed to audit quite a bit of data. I was lucky enough to work with some wonderful and talented people. One of those people was the DBA. I took what he did and said as database gospel. He was the type of person who would help you understand things if you asked - and if he didn't know, he would tell you, but get back to you later with an answer. He probably got sick of seeing me as often as he did. :) When I started at that company, I knew that I had enough understanding to be dangerous, but I also knew full well that I had a LOT to learn. I took the opportunity to learn from him.

    He set up a system of auditing using triggers on MANY tables in the transactional database. I started learning everything I could about using the different types of triggers (insert, update, delete) and the way to use the system tables to put the data here and there, and to record what happened and by whom. I ended up learning a lot of "magic behind the curtain" about SQL and triggers along the way.  When I was hired, the database used was SQL 2000 (we later went through a couple of upgrades - the validation was a lot of work). I never even thought about other options for auditing - this solution worked well, so why wouldn't everyone use it? :)
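    The trigger-based style of auditing described above can be sketched like this. This is a hedged illustration of the general technique, not the actual system - the Patients and AuditLog tables and their columns are hypothetical. The key ideas are the inserted/deleted pseudo-tables (which tell you what changed) and SUSER_SNAME() (which tells you who changed it).

```sql
-- Hypothetical audit trigger: one row in dbo.AuditLog per changed row.
CREATE TRIGGER trg_Patients_Audit
ON dbo.Patients
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.AuditLog (TableName, Action, ChangedBy, ChangedAt, KeyValue)
    SELECT N'Patients',
           -- Rows in both pseudo-tables mean an update; only inserted
           -- means an insert; only deleted means a delete.
           CASE WHEN EXISTS (SELECT 1 FROM inserted)
                 AND EXISTS (SELECT 1 FROM deleted) THEN N'UPDATE'
                WHEN EXISTS (SELECT 1 FROM inserted) THEN N'INSERT'
                ELSE N'DELETE'
           END,
           SUSER_SNAME(),   -- who made the change
           GETDATE(),       -- when
           COALESCE(i.PatientID, d.PatientID)
    FROM inserted AS i
    FULL OUTER JOIN deleted AS d ON i.PatientID = d.PatientID;
END;
```

    One trigger like this per audited table is exactly why newer features such as CDC (below) are appealing: the trigger approach works, but it multiplies across MANY tables.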

    When I started at my current job, I realized my passion for SQL. I knew I liked it before, but this job has really given me an opportunity to get my hands dirty much more. That is when I was introduced to the ETL and the data warehouse. I had been introduced to the concept of a warehouse before, so that wasn't new, but I hadn't been involved in the manipulations and loading of data.

    In doing research about ETL in general, I learned about Change Data Capture (CDC) and Change Tracking. I have to admit, I was initially a little bummed. I had learned something that was working very well, but here was this new functionality that had the same end result, just worked differently. I eventually learned quite a bit more about it and was thrilled with the possibilities. I wanted to start using CDC with the ETL to help with incremental loading. I haven't gotten my chance and might not be able to since we are phasing out the only data source that we house internally - all other data comes from outside vendors by either files or database backups. Therefore, I put the thought of auditing on the back burner.

    Recently, however, we had a need on the application development side to audit who made certain changes. BINGO - perfect timing - I had already done the research previously, so I knew how to implement things to get what we needed up and running pretty quickly. Of course, as with everything else, other priorities have slipped in, but I am almost done with the implementation. Setting up the CDC portion was a snap; getting the data where I want it so I can use it is what is taking more time. I am taking the audited records, putting the changes into an XML string, then saving that in a central repository for audit logging and reporting (while keeping it dynamic so I can have one script no matter what the data structure of the audited table looks like).
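    For anyone who hasn't set CDC up before, it takes two steps: once per database, then once per table. The sketch below also shows one way to collapse the captured change rows into XML, in the spirit of the "one script regardless of table structure" idea above - the Patients table and the AuditRepository table are hypothetical stand-ins, not the real objects.

```sql
-- Enable CDC on the database, then on each table to audit.
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Patients',   -- hypothetical audited table
     @role_name     = NULL;          -- no gating role for this sketch

-- CDC writes changes to cdc.dbo_Patients_CT (schema_table_CT).
-- Collapse the captured rows to XML so the repository script does not
-- depend on the audited table's structure:
DECLARE @changes xml =
    (SELECT * FROM cdc.dbo_Patients_CT FOR XML RAW('row'), ELEMENTS, TYPE);

INSERT INTO dbo.AuditRepository (SourceTable, ChangeData, CapturedAt)
VALUES (N'dbo.Patients', @changes, GETDATE());
```

    A production version would track the LSN range already processed (via sys.fn_cdc_get_min_lsn and friends) rather than re-reading the whole change table, but the shape of the approach is the same.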

    In summary, I learned a lot from looking into what made the first auditing process I was exposed to "tick". Then, I learned more about changes in functionality between SQL 2000 and SQL 2008 R2.

    I didn't know about the SQL community until a year ago (Thank you - SQL Saturday in Baton Rouge 2012). I have met some wonderful people so far and look forward to meeting even more! All in all, it is really awesome to say that while I don't have the "admin" type of historical knowledge a lot of people do, the auditing topic is something where I can say I understand the before and after.

    Thank you Mickey for such a great topic - and talk to you soon!

  • TSQL Tuesday #47

    This month's topic is all about SWAG. Kendal Van Dyke (blog | twitter) is hosting this month and wants to know "the story about the best SQL Server swag you've ever received." This is definitely a topic that will have some fun answers. :)

    I know there will be several people this month who will have to really think about what the best SWAG they have received is. They will have to look at several things before making a decision. I am not in that predicament - not yet anyway. Of course, if any vendors are reading this, feel free to send me some stuff to put me in that position! :)

    I have received exactly one thing so far that is my favorite (I have actually only received two things, so my choice was not hard). During TechEd 2013, there were daily Q&A-themed twitter posts from Microsoft. I participated each day - and I won! I have never won anything, so it was a lot of fun. The prize was a USB power stick. I had never heard of these things before, so I read the instructions that came with it. It is an 8GB flash drive, and it also keeps a charge that you can use to power other electronic devices. At the time I got it, I wasn't sure where I would use it, but there have been two specific times it has come in really handy.

    1. I went on a trip with my son this past summer. We went to the easternmost point of the US (up in Maine), saw a couple of waterfalls, and went through Kentucky and Florida before coming home to Louisiana. I brought it with me, and it was a good thing. We went camping (with a tent and sleeping on the ground - real camping stuff) and I was able to still use my iPod touch because of it.
    2. On the same said trip, while in Kentucky, my friend/coworker, Joe, and I went to a music festival. We got to see Willie Nelson (woo hoo!). Joe didn't have his phone charged - voila! - I was able to help him out. Without that, he wouldn't have gotten pictures of Willie Nelson. (I just had to say his name again - it was sooo exciting, and I was standing up front, real close.)

    Pic coming as soon as I get home! :)

    So, my favorite SWAG helped me help a friend - and he got more pictures of Willie Nelson. I am sure there will be someone who can beat that - I am excited about reading the other posts. Thanks Kendal for hosting this month!

     

    EDIT: 10/8 9:26 PM CST - here is the picture! :)

     

  • TSQL Tuesday #50

    This month's topic is: Automating Tasks and is presented by SQLChow (blog | twitter).

    Automation - it's a blessing and a curse. The curse is that you want to do it, but who has the time? Of course, the blessing is that it will save you time in the long run - but you have to put in the time up front in order to gain that extra time. :)

    I use SSIS daily for my job. I work with the ETL, so automating the data warehouse steps comes naturally. Each time I do something for someone, I try to ask whether it is a one-time thing or something that will be repeated. I would rather automate something if I can.

    Although I am sure this T-SQL Tuesday topic is about administrative types of processes that are automated, I think the best thing we have automated at my job recently is a set of "checks" at a certain time to make sure everything is as it should be. We are only checking some things, but it has led me to do more. Right now, the checks make sure there is data in certain tables where it should be, and they check certain data in the warehouse to make sure it is within a 5-10% range of yesterday's totals. These automated checks have really saved our bacon a couple of times. Before they were automated, we had to look at a report daily to make sure the data looked ok.
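    A check like the "within 5-10% of yesterday's totals" one above can be as simple as the sketch below. This is a hypothetical illustration - the dw.FactVisits table, its columns, and the 10% threshold are stand-ins, not our actual warehouse objects.

```sql
-- Compare today's loaded total against yesterday's and raise an error
-- (which the SQL Agent job turns into a notification) if it moved too much.
DECLARE @Today decimal(18,2) =
    (SELECT SUM(TotalAmount)
     FROM   dw.FactVisits
     WHERE  LoadDate = CAST(GETDATE() AS date));

DECLARE @Yesterday decimal(18,2) =
    (SELECT SUM(TotalAmount)
     FROM   dw.FactVisits
     WHERE  LoadDate = CAST(DATEADD(day, -1, GETDATE()) AS date));

IF @Today IS NULL
    RAISERROR (N'No warehouse data loaded for today.', 16, 1);
ELSE IF @Yesterday > 0
     AND ABS(@Today - @Yesterday) / @Yesterday > 0.10
    RAISERROR (N'Warehouse totals moved more than 10%% day over day.', 16, 1);
```

    Running this as the last step of the load job means a failed check fails the job, so nobody has to remember to eyeball a report.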

    So, back to the main topic. I have recently been learning PowerShell. It is a great tool for automating multiple servers in a single script. My latest automation is to update the DEV database from a copy of the PROD backup file. I get the latest backup filename and location, move the file from the production server to the development server, then restore it with different final data file locations, and finally update users and permissions. It was my first PowerShell script - thank you to Sean @ MidnightDBA for the help!
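    The steps above could be sketched in PowerShell roughly as follows. This is not the actual script - it is a hedged sketch assuming the dbatools module is available, and every server name, share, path, and database name in it is a hypothetical placeholder.

```powershell
# Hypothetical sketch: refresh DEV from the latest PROD backup.
Import-Module dbatools

# 1. Find the most recent PROD backup file and copy it to DEV.
$backup = Get-ChildItem '\\prodserver\Backups\MyDb' -Filter *.bak |
          Sort-Object LastWriteTime -Descending |
          Select-Object -First 1
Copy-Item $backup.FullName '\\devserver\Staging'

# 2. Restore on DEV, relocating the data and log files.
Restore-DbaDatabase -SqlInstance 'DEVSERVER' -DatabaseName 'MyDb' `
    -Path ("\\devserver\Staging\" + $backup.Name) `
    -DestinationDataDirectory 'D:\Data' `
    -DestinationLogDirectory  'L:\Logs' `
    -WithReplace

# 3. Re-map users/permissions for the DEV environment.
Invoke-DbaQuery -SqlInstance 'DEVSERVER' -Database 'MyDb' `
    -Query "ALTER USER AppUser WITH LOGIN = AppUser;"
```

    The same flow can be written with raw SMO calls (which is closer to what a first PowerShell script in 2014 would have used); dbatools just makes each step a one-liner.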

    I know that this year (2014), will be spent learning more about database administration, and more automation scripts will come to be. It is exciting to know that I will feel more in control of the overall SQL environment when I have automated more of it.

  • TSQL Tuesday #57

    Jeffrey Verheul (blog | twitter) is hosting TSQL Tuesday. He thought of a great topic: SQL Family and community.

    I first heard about the SQL community by way of the SQL Saturday in Baton Rouge (Aug 2012). I hadn't heard of anything like that before. I didn't have a twitter account and I hadn't been in a true DBA role, so when I wanted to learn more, I just thought of books. Once I went to the SQL Sat BR, it was like a whole new world opened up for me. I loved how the leaders interacted with each other, joking and kidding around - and honestly - I wanted that! I was a little too much in AWE of everything that day to approach anyone, but I left with some awesome tips I had learned and couldn't wait for more.

    I learned about their user group, and even though I was an hour away, I wanted to attend. Life got in the way, so I didn't attend many in that next year. However, when 2013 rolled around, I decided to throw my hat in the ring and submitted to be a speaker. I had some previous experience with public speaking, so I was only nervous about the technical content portion. They chose me, and after I presented, I realized I was completely addicted! I was thrilled to meet people whom I had learned from the previous year. It was an amazing experience all around.

    I have since spoken at the user group in Baton Rouge, this year's SQL Saturday in Baton Rouge (can't get enough of them), at two virtual chapters, and the SQL Saturday in Dallas (Nov 2013). Each time I have had the privilege of meeting more and more people; each time I get a little awe-struck, and each time I feel relaxed and at ease when actually talking with them.

    I have since started a user group here in Lafayette, LA (although I still travel to BR for meetings too). One of the main reasons I started the group was to have people closer to home that I could connect with in the same way I do with others throughout the community. It is amazing to know that even though I have a lot I need to learn, I haven't met anyone who judges me negatively for that. In fact, it is quite the opposite. I don't know of any other IT community that responds to others in such an overall friendly way, and I feel privileged that I get to be a part of it.

    P.S. - I mistakenly put someone else's name as the host for this month - and it turned into yet another example of how great this community is. I hadn't talked with him before, but I guarantee you when I meet him I will remember this and, based on his messages, we will both chuckle in a nice way. :)

Thank you for visiting!