Sunday, January 21, 2018

TSQL Tuesday #98

This month's T-SQL Tuesday is being hosted by Arun Sirpal (blog) - Your Technical Challenges Conquered. I love how Arun wants to jump in at the start of the new year and talk about something we have conquered... it really sets the tone for the year!!

While this isn't the most challenging thing I have ever done, it was (and still is) something I am proud of.  For my post, I am writing about a scenario that stretched my planning skills in order to come up with a robust solution.

My company provides visits to patients in their homes, hospice care, or long-term care in an acute care setting.  We have multiple facilities throughout the USA.  The vendor of one of our EMR systems creates a separate database for each facility, so the vendor currently has 40+ databases for our company.  For reporting and analysis needs, we needed to bring a copy of the data into our data center.  Of course, when the business asks for something, they want it right away.  The easiest and quickest way was to set up log shipping.  Discovery led us to realize that the end result would be used by the ETL team for importing into the data warehouse, the reporting team for ad-hoc reporting, and the application development team for various custom-built in-house applications.  Therefore, on top of the log shipping, I also needed to make it easy for them to pull the data in a consistent way, without having to keep up with new agencies, change connection strings, etc.

When thinking about the project as a whole, I knew I had the following requirements:

  • agencies would be added as we transition them from another software to this EMR
  • agencies would be added as we acquire new companies
  • the vendor's folder structure held the backup files in a root folder, with a sub-folder for each database
  • handle issues that arise for one or multiple databases without interfering with the other database restore processes
  • report on the "up-to-date" date for each database when users ask (there will always be someone who doubts the data is up to date because they don't see something they expected)
  • combine/aggregate all data into a single database so that data can be queried in a consistent manner
  • an encryption process needs to be included so the files are encrypted in transit (even though we use SFTP) and at rest

After planning, then polishing, then throwing that away, then planning some more, then polishing some more, etc., my end result is something that flows fairly well.  I have a table which lists all of the databases, the folder where the log backups are, a flag to indicate "PerformLogBackups", and a flag to indicate "AggregateData".  I have a decent-sized script which queries the list of databases that need to be updated, copies the encrypted files off the SFTP server onto a file share (just in case...), and then moves the files to the SQL Server's local drive.  It then decrypts all of the files, loops through each database, and restores each file in order.  Once all of the restores are completed, another job runs which aggregates the data into a single database for reporting needs.  Lastly, it cleans up the files and sends appropriate notifications.

The table which holds the list of databases solves the requirement of adding new agencies easily.  It also handles the scenario where issues arise with some, but not all, databases, by toggling the flags appropriately.  If a certain database is having an issue, toggling its flag will stop it from being restored, while the others continue as normal.  Easy peasy... lemon squeezy...
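A minimal sketch of what that control table and the driver query could look like (the table, column, and database names here are illustrative assumptions, not the actual schema from the post):

```sql
-- Sketch of a control table driving the restore process.
-- Names are illustrative only; adjust to your own standards.
CREATE TABLE dbo.AgencyDatabases
(
    DatabaseName      sysname       NOT NULL PRIMARY KEY,
    LogBackupFolder   nvarchar(260) NOT NULL,
    PerformLogBackups bit           NOT NULL DEFAULT 1,
    AggregateData     bit           NOT NULL DEFAULT 1,
    LastRestoredAt    datetime2     NULL  -- answers the "is it up to date?" question
);

-- The restore job only picks up databases that are flagged on:
SELECT DatabaseName, LogBackupFolder
FROM dbo.AgencyDatabases
WHERE PerformLogBackups = 1
ORDER BY DatabaseName;

-- Pausing one problem database without affecting the rest:
UPDATE dbo.AgencyDatabases
SET PerformLogBackups = 0
WHERE DatabaseName = 'Agency42';  -- hypothetical database name
```

Flipping a single bit per database keeps each restore chain independent, and a `LastRestoredAt`-style column gives a ready answer for the "up-to-date" question.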

I have checkpoints in the process, which will be covered in an upcoming post and presentation.  Also, I still need to do more to make it even more well-rounded and hardy, but that is another reason I am writing about it today - it is still ongoing!  I love the idea of having something that works but that I can tweak and polish to make it better and better.  After all... is anything ever really done? :)


Near-Real Time Vendor Data On-Prem

The company I am working for has a copy of a vendor database on-prem for reporting and analysis.  The database is over 4 TB at this time, and constantly growing due to acquisitions and continued patient care.

Currently, our process for updating the data is log shipping, a tried-and-true method that has successfully kept the database updated for years now.  Requirements are ever-changing, and there is a desire to have data that is updated more often during the day.  This article describes the current process and explains how Availability Groups can be used to fulfill the requests.

Click here for a PDF that shows the current process and both new processes, as explained in detail below.



TSQL Tuesday #97

This month's T-SQL Tuesday is being hosted by Malathi (blog | twitter) - Setting learning goals for 2018

I have a feeling 2018 is going to bring a change in my work life and I would like to be prepared for that change.  Feeling confident in my skills will be a big part of how I get through that transition.  This post goes perfectly in line with some of the planning I have already started, and it gives me the chance to actually write it down (and therefore commit to it even more)!

Learning Goals:

All of the learning I have done so far has been looking up an issue or researching a specific topic as it came up.  I want to do some structured learning in 2018, so my goal is to be prepared by the end of Q1 2018 to take the 70-462 certification exam (Administering Microsoft SQL Server 2012 Databases).  I know that certification exams can be crammed for and don't always give a sense of a person's skills, but I am using them as they were intended, and it has helped me create and organize a learning plan.  I have more knowledge than I realize, but I am excited about tying things together in my head for a more well-rounded understanding.  Not knowing all the specific details and the underlying "why" behind the foundational topics that are normal for a DBA can sometimes cause me to hesitate.  This leads to a lack of confidence in my skills at key times.  Also, I don't always use the correct terminology even though my overall understanding is in place.



Using Triggers for auditing changes to SQL jobs

Recently, a job was changed to stop at an earlier step than normal (presumably for testing).  When it ran that night, it didn't run all of the steps.  This job has a lot of dependent processes, which had to be re-run the next day, causing a bit of chaos for a few applications, data warehouse data, and reports.

In order to see who made the change, there would have to be some sort of auditing running prior to when the change happened.  Auditing of SQL jobs (and more) had already been identified as a project for our department, but it hasn't been prioritized yet.  This, of course, doesn't help identify what happened this specific time. :)

In the meantime, management wants to be able to track these types of changes.  Therefore, I started researching the options and found some common solutions:
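Since the post is truncated here, one common solution I can sketch is a trigger on msdb's `dbo.sysjobsteps` table, which SQL Agent updates when a job step is changed.  The audit table name and shape below are assumptions for illustration, not the solution from the post:

```sql
-- Sketch: audit updates to job steps via a trigger on msdb.dbo.sysjobsteps.
-- The audit table (dbo.JobStepAudit) is hypothetical; adjust to your standards.
USE msdb;
GO

CREATE TABLE dbo.JobStepAudit
(
    AuditID    int IDENTITY(1,1) PRIMARY KEY,
    JobID      uniqueidentifier,
    StepID     int,
    StepName   sysname,
    OldCommand nvarchar(max),
    NewCommand nvarchar(max),
    ChangedBy  sysname   NOT NULL DEFAULT SUSER_SNAME(),
    ChangedAt  datetime2 NOT NULL DEFAULT SYSDATETIME()
);
GO

CREATE TRIGGER trg_AuditJobStepChanges
ON dbo.sysjobsteps
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- Capture the before/after command text for every changed step
    INSERT INTO dbo.JobStepAudit (JobID, StepID, StepName, OldCommand, NewCommand)
    SELECT d.job_id, d.step_id, d.step_name, d.command, i.command
    FROM deleted d
    JOIN inserted i
      ON i.job_id = d.job_id
     AND i.step_id = d.step_id;
END;
GO
```

With this in place, `SELECT * FROM msdb.dbo.JobStepAudit` shows who changed which step and when.  (Triggers on msdb system tables are unsupported territory, so treat this as a stopgap until a proper auditing project lands.)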




TSQL Tuesday #96

This month's T-SQL Tuesday is being hosted by Ewald Cress (blog | twitter) - Folks Who Have Made a Difference

There are 3 people who have really contributed positively to my career in data.  Two are in a DBA role; the other wasn't anywhere close.  They all helped in a different way, which is why I am listing them all.

The first person I want to talk about is a past boss.  When I was starting out in my career, I was an administrative assistant in the pharmacy department of an HMO.  The head of the department, Mary, was a pharmacist and very friendly, yet tough.  She worked with a committee of doctors to set policy within the HMO we worked for, and it affected all of the members we served throughout Florida.

I did the normal administrative assistant type of duties, but she allowed me to stretch beyond those borders and do other helpful things as well.  One of those pet projects imported prescription utilization data into an Access database, manipulated it, then exported the results as both Excel reports and Word documents.  I was so proud of what I had done, and should have taken the hint that this was the direction I wanted to go in (but of course I didn't at the time).  I still think about her.  She helped me in more ways than I realized and was a very good mentor (and boss) to have!

The next person who contributed to my path was a DBA I used to work with, Stuart.  When I realized I should actually be in IT, I found a job as a report writer for a local company.  I worked my way into being a developer, and then started working with databases more.  I would ask Stuart all kinds of questions and he was always patient.  I think he recognized that I wanted to learn, and that I wasn't questioning the decisions behind processes, but wanting to understand for the sake of knowledge.  He helped me learn the basics so that I could explore more on my own.  For example, he sat down with me and helped me understand how to prioritize the list of database objects in the FROM clause and why it mattered.  That jump-started many nights of studying to figure out even more.  I still talk to him from time to time and hope to be a friend for many years to come.  He also always had great stories to listen to!

I admittedly don't know how to thank either of them for their patience and understanding.  The best thing I can think of is to help others in a similar manner.  I feel all gooey inside when I think about them and hope that I can have that effect on someone else one day. :)

The third person I wanted to talk about is Sean McCown.  I met him at a SQL Saturday in Baton Rouge a few years ago, where I attended a session he presented on Powershell.  That particular SQL Saturday was my first one ever.  I hadn't even known that type of event existed until that year.  The event was fantastic and I was so excited during and after (I have to give a shout out to the BRSSUG - they are awesome people)!  The way he presented the information and answered questions was something I hadn't really experienced before.  It was very clear that he enjoys what he does, and his upbeat personality draws people in.  I remember being in awe and wanting to do that too.  He also recorded himself, as he does with most of his sessions.  That was the first time I really thought about and started researching what it takes to brand myself.  He and Jen also put on a weekly webshow, DBAs@Midnight, which is an incredible way to talk to them regularly.  Since then I have been a speaker at a few SQL Saturday events and user group meetings - partly in thanks to him.  I still think back to the initial excitement and feel so thrilled that the SQL community is filled with so many wonderful people!


Working with current and previous rows in text files

An issue came up today where the vendor was sending a text file, but the text for some records spanned multiple lines, rather than a single line.  After researching, I found out that I needed to compare each line of text and, based on certain criteria, append it to the prior line.  I couldn't just strip certain end-of-line markers because the file mixed them.

I am using a dummy test file in this example, but here is how everything played out:
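Since the rest of the post is truncated here, one way the "append to the prior line" idea can be sketched in T-SQL is to bulk-load the raw lines into a staging table and group continuation lines under the last real record start.  The staging table, sample rows, and the "starts with a digit" rule below are assumptions standing in for the actual criteria (`STRING_AGG ... WITHIN GROUP` needs SQL Server 2017+):

```sql
-- Sketch: merge continuation lines back onto their parent line.
-- The "starts with a digit" test is a placeholder for the real record-start rule.
CREATE TABLE #RawLines
(
    LineNumber int IDENTITY(1,1),
    LineText   nvarchar(max)
);

-- A BULK INSERT from the vendor file would populate this; dummy rows here:
INSERT INTO #RawLines (LineText)
VALUES ('101|note begins'),
       ('  and continues here'),
       ('102|a one-line note');

WITH Flagged AS
(
    SELECT LineNumber, LineText,
           -- Running count of record starts = grouping key for each record
           SUM(CASE WHEN LineText LIKE '[0-9]%' THEN 1 ELSE 0 END)
               OVER (ORDER BY LineNumber) AS RecordID
    FROM #RawLines
)
SELECT RecordID,
       STRING_AGG(LineText, ' ') WITHIN GROUP (ORDER BY LineNumber) AS FullLine
FROM Flagged
GROUP BY RecordID;
```

The running `SUM` over the record-start flag is what gives each continuation line the same `RecordID` as the line it belongs to.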



Using Notepad++ to change end of line characters (CRLF to LF)

End-of-line characters include CR and LF.  Windows uses the pair CRLF at the end of a line, whereas Unix uses only LF.

  • CR = Carriage Return
  • LF = Line Feed

Recently, while troubleshooting why data wouldn't import successfully as part of an automated process, I was pulling a subset of data out of the main text file, but the end-of-line markers weren't correct.  I copied several lines using Notepad++, and it automatically used CRLF markers.  The automated process expected the end-of-line markers to be LF so the file could be read by the SSIS package properly.

This article will help jog my memory when I run across this again... but hopefully it helps someone else too! :)

First off, within Notepad++ you need to indicate that you want to see the end-of-line markers.  Click View > Show Symbol > then either Show End of Line, or Show All Characters (if you want to see spaces and tabs as well; sometimes the second option is easier).


