Why DBAs (and IT Pros) should cultivate a habit of mindfulness

I’ve always believed that while technical skills are incredibly important to our success as IT professionals, there are also a number of key proficiencies we must acquire if we are going to advance in our careers. Some time ago, I wrote a piece about how “learning how to learn” was one of these; after all, if we can’t acquire and retain useful knowledge, then surely we won’t be able to improve our lot in life. Today, I want to introduce another of these keystone soft skills, as I’ve come to call them: mindfulness.

What is mindfulness exactly? I like this definition from the U.C. Berkeley Greater Good site:

Mindfulness means maintaining a moment-by-moment awareness of our thoughts, feelings, bodily sensations, and surrounding environment.

Mindfulness also involves acceptance, meaning that we pay attention to our thoughts and feelings without judging them—without believing, for instance, that there’s a “right” or “wrong” way to think or feel in a given moment. When we practice mindfulness, our thoughts tune into what we’re sensing in the present moment rather than rehashing the past or imagining the future.

To put it another way, mindfulness helps us disentangle ourselves from the endless chatter in our minds. When we pause, take a few breaths, and focus on our breathing, we can notice all sorts of things flowing through our heads. Words, phrases, and whole stories float by in waves, coming and going like the tides of the great oceans. Sometimes these stories are good and useful, such as when we’re pondering a pleasant memory of time spent with loved ones, or thinking about a particularly knotty problem we’re trying to solve at work. But in many other cases, the thoughts in our mind can be downright harmful, especially if we get caught up in them. To continue the analogy of ocean tides: while it can be great fun to play and swim in the waves, it’s also true that if we’re not careful, we can find ourselves swept away by the powerful currents, suddenly becoming conscious that we’ve been pulled out to sea.

Some of the very same traits that make us good IT professionals can also harm us in this way. For example, I’ve found that the best technology workers tend to have an almost obsessive quality about them. They are meticulous and detail-oriented when it comes to their work; they insist on doing things in pre-ordained and methodically planned ways, because that’s how you get consistent results time and time again. They are also good at taking a problem, breaking it into smaller components, and chewing them over in their minds, ruthlessly chipping away until the problem is resolved. And believe me, until the obstacle is beaten, they won’t let it out of their minds.

Now, would you say that the traits I’ve just listed are positive ones? Really stop and think about it for a moment. Notice what comes into your mind, without judging it or questioning it. Focus on it for a few minutes, mulling it over. Then, write down what you noticed yourself thinking.

Chances are, if you’re like me, your reaction was largely positive. After all, these skills and tendencies are probably among the reasons we are successful in our careers, and that’s undoubtedly a good thing. But at the same time, there’s another side to this, one that isn’t quite so positive. Let me tell you a story to illustrate.

Not so long ago, I was working through some rather tough issues at work. We were in the midst of a rapid and rather complex deployment in the Microsoft Azure cloud, all of which was completely new to me. We were under tight deadlines, with a great deal of pressure to get everything up and running successfully. As with most new endeavours, we encountered numerous problems and obstacles that needed to be either circumvented or resolved.

Now my brain is a fantastic piece of machinery when it comes to this sort of thing. I love a challenge as much as anyone, especially when it comes to novel areas of technology. So naturally, this work dominated my thinking almost constantly, even when I was not at work. I would find myself caught up in trying to figure out solutions while in the shower (remember that cliché about our best thinking happening there? It’s totally true), or while working out at the gym, or even while spending time with my family in the evenings.

One night I was deep in thought, researching things on my phone while getting the kids ready for bed. I was sitting in the bathroom while the younger of my two kids played in the bath tub. I could tell I was close to a breakthrough; the pieces were slowly fitting together in my mind, inching towards a coherent picture. It felt wonderful.

Then, suddenly, a small voice penetrated my deep thinking: “Daddy, will you please come play with me?” It was my daughter, who was looking up at me with that typical wide-eyed, expectant look on her face.

I started to tell her that I was busy, but that I would play with her soon. But then I stopped for a moment, pausing before any words left my mouth. I could tell I was about to respond in my usual automatic way; perhaps it’s my ADD-riddled brain, but I have a rather pervasive tendency to speak without a lot of thought. But something in me also called out: “Wait, breathe, and be aware: what am I experiencing in this moment?”

I let my attention fall to my breathing, slowing it to a measured, purposeful pace. I noticed how it felt as my stomach rose and fell gently, how the air felt moving all the way from my nostrils down into the deepest part of my lungs. Then, after a few breaths, I noticed my thoughts floating in front of me: “I need to figure this out. I’m so close! Why does she have to bother me right now, all I need is a little more time.” I felt the powerful pull these exerted on me, urging me to ignore or otherwise dismiss her request.

But then I asked myself a simple, yet powerful question: “What’s the most important thing I can do, right here, right now?”

The answer came easily: spend time with my daughter. She is still at that wonderful (and at times incredibly frustrating) age where children look to their parents with almost magical reverence. Opportunities like this are precious, and I know there will come a day when she no longer wants desperately to play with her Dad.

So what did I do? I put my phone face down on the counter, got down next to the tub, and played with her. I let all my attention be directed towards her and me, just being fully present in the moment. From time to time, thoughts about that pesky work problem crept back in, trying to worm their way into my head and disrupt my focus on playing with her. And when they did, I simply noticed them, said, “Gee, thanks for that thought, mind; I’ll get back to you,” and went back to what I was doing.

That, folks, is the real power of mindfulness. It would have been so, so easy to get caught up in the tide of my mind, be swept off, and totally miss those precious few minutes with my child. Instead, by bringing the gentle (yet oh so powerful) strength of awareness to bear, I was able to step off that path and focus on what was truly important to me in the moment.

If that sounds awfully “woo-ish” to you, you’re not alone. I too once believed that meditation and mindfulness were largely some kind of mythical practice, mostly done by monks sitting in remote monasteries in the mountains. But over time I’ve come to see that as nothing more than a false first impression. In fact, mindfulness as a practice can be completely secular (though for some, the spiritual elements may in fact be an attractive point), and it increasingly has a great deal of scientific research behind it. For example, studies have shown that mindfulness meditation can lead to fundamental changes in the structure of our brains, reduce recidivism among inmates, and (perhaps most importantly for us) reduce stress and increase resiliency in the workplace.

So how might we actually get started in the practice of being mindful? Simple: pay attention. Mindfulness is really nothing more than focusing our attention on what is happening in the present moment, both externally as well as internally. In fact, if you did the exercise earlier in the post where I asked you to notice what you were thinking, congratulations, you’ve practiced being mindful!

Simple? Yes. Easy? Not in the least! Our minds are so very good at pulling us out of the moment, into a state of anxiety or rumination. Even the best meditators and mindfulness experts still suffer from this; it’s just a basic part of the human condition. But over time, it does become more natural, and the benefits are tremendous.

In some future posts I’ll talk about some quick ways to get started practicing mindfulness in our daily lives. For now, I hope I’ve stirred some interest in learning more about this crucial and fascinating practice.

3 Habits of Successful DBAs

I’ve been realizing more and more as I get older just how much of a habit-driven creature I am. I’m at my best, both productivity-wise and happiness-wise, when I stick to my regular routines. Some would probably say I take this too far (perhaps when they see the detailed written checklists that I follow every morning and evening), but the truth is that it works for me.

As an example, look no further than the lack of recent posts or e-mails I’ve written. Over the last month or so I’ve been on two vacations, separated by a couple weeks of crazy busy times at work. As a result, I’ve been off my game for the entire time. Normally I have set times during the week when I sit down and write, and so long as I do, I’m pretty good at cranking out work. But when those times get disrupted, my production falls through the floor.

The same thing is true in my work as a DBA. Over the years I’ve cultivated a number of habits and routines that have unquestionably made me more successful. They span a number of different aspects of my work; some revolve around learning, others around procedures or tactics for completing my daily tasks. Here are three general categories, each with a specific example.

Set aside regular time for learning

Every day I spend around an hour reviewing some aspect of SQL Server that I’m not familiar with. Whether it’s reading blogs, reviewing articles on SQL Server Central, or playing around in my lab setup, I try to get in some focused time closing my knowledge gaps. Truthfully, I don’t think I spend enough time doing this; even as it stands, I find there are more areas of SQL Server that I don’t know about than I’d like (like the new functionality around the ‘R’ language). But like many things, some time is better than none.

Just today I spent time going through lessons in Edwin Sarmiento’s excellent course on failover clustering internals. I consider myself fairly knowledgeable here, yet I still found that I learned numerous new things. For example, I knew that Cluster Shared Volumes exist, but never really understood what they were or how they worked. Thanks to Edwin, I now know that they can be hugely useful in increasing the availability of your clustered SQL Server databases (provided they’re configured properly of course).

Practice regularly – especially infrequently used skills

Pop quiz: when’s the last time you had to restore a database to a particular point in time? What about restoring a corrupt page? Re-establishing an availability group after a database gets out of sync?

If you’re like many, then the answer to the above questions may well be “never”. And I’ll be honest: that scares me. As DBAs, we’re going to find ourselves up against the wall in a crisis situation sooner or later; we’re going to be called upon to fix things, and yet many of us don’t take the time to practice the very skills we’ll need in those moments. Yes, it’s true that with Books On-Line available you could probably muddle through, but do you really want to be searching through pages upon pages of documentation while the V.P. of Operations is sitting on your back?
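
To make that concrete, here’s a minimal sketch of the kind of point-in-time restore sequence worth rehearsing. The database name, file paths, and timestamp below are made up for illustration; it’s the shape of the sequence that matters.

    -- 1. Restore the most recent full backup, leaving the database able to accept further restores
    RESTORE DATABASE SalesDb
        FROM DISK = N'D:\Backups\SalesDb_Full.bak'
        WITH NORECOVERY, REPLACE;

    -- 2. Apply the latest differential backup, if you take them
    RESTORE DATABASE SalesDb
        FROM DISK = N'D:\Backups\SalesDb_Diff.bak'
        WITH NORECOVERY;

    -- 3. Roll forward the log backups, stopping just before the point of failure
    RESTORE LOG SalesDb
        FROM DISK = N'D:\Backups\SalesDb_Log_01.trn'
        WITH NORECOVERY, STOPAT = N'2016-06-01T14:35:00';

    -- 4. Bring the database online
    RESTORE DATABASE SalesDb WITH RECOVERY;

If you can walk through that sequence (or at least sketch it on a whiteboard) without reaching for the documentation, you’ll be in far better shape when the real call comes.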

Not too long ago, I sat down with my fellow DBA and we stepped through our procedures to recover one of our servers, relying completely on the documentation. This helped in two obvious ways: first, we became more familiar with the process, so that we could complete it more quickly in a true recovery scenario; second, we found numerous gaps in what we had written down. If we had never taken the time to practice, we would never have found these.

Review the systems you own, with your own eyes

In today’s world of automation and fancy monitoring software, it’s not uncommon for DBAs to come to rely on their tools to alert them to problems. And while this is overall a very good thing (can you imagine trying to manually monitor hundreds of servers?), it can lead to a number of unintended negative consequences.

When I was still fairly young in my career as a DBA, I worked on a team that included a gentleman who had been administering server hardware for more than half my lifetime. His knowledge of infrastructure was vast, but what really impressed me about him was his old-school attitude. I remember one time when he invited me to come down to the server room with him and walk the raised floor. When I asked why we’d do something like that when we have sophisticated monitoring software, he chuckled and just said, “Let’s go.”

As we were walking the floor, we found a server showing failed hard drives as well as a failed power supply. Both of these are potentially huge problems that we might never have found had we not taken the time to do a visual check. He later told me that he did this walk-through at least once a week, just to “get a feel of things”. He explained that by keeping himself familiar with the physical space, he was able to notice things before they became larger problems. He also mentioned that while monitoring software is a wonderful thing, it can’t replace a good old visual inspection.

Much as airline pilots perform a manual “walk-around inspection” before each flight, as DBAs we should regularly be laying eyes on the systems we administer. There are a number of ways to accomplish this, from logging in to servers on a rotating basis to regularly reviewing performance baseline numbers and looking for deviations. What’s important is that we become familiar with “the lay of the land”, so that we notice differences.
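
As one minimal example of the kind of quick check I have in mind, a simple query against the backup history in msdb shows at a glance whether anything has silently stopped being backed up (the 'D' filter limits it to full backups):

    -- A quick "walk the floor" check: when did each database last get a full backup?
    SELECT  d.name AS database_name,
            MAX(b.backup_finish_date) AS last_full_backup
    FROM    sys.databases AS d
    LEFT JOIN msdb.dbo.backupset AS b
           ON b.database_name = d.name
          AND b.type = 'D'            -- full backups only
    WHERE   d.name <> 'tempdb'
    GROUP BY d.name
    ORDER BY last_full_backup;

A NULL in that second column is exactly the kind of thing the monitoring software was supposed to catch, and exactly the kind of thing a regular manual review tends to find.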

Summary

These three examples alone will make a huge difference in the trajectory of your career as a DBA if practiced regularly. The key word here, however, is regularly. Like any habits, these will only provide value if they are done repeatedly and often. Here’s a lesson we can take from my psychology background: to loosely quote bestselling author Gretchen Rubin’s book Better Than Before, the habit of the habit is more important than the habit itself.

Huh?

Think of it this way: it’s better to do some tiny thing related to these three habits each and every day than to try and devote massive amounts of time to them but only do so irregularly. Better that we take a thousand small steps over time than a handful of larger jumps, for as the great Greek philosopher Aristotle once said, “We are what we repeatedly do.”

If you want to be a better DBA, do the things that better DBAs do, and do them often. Sooner or later, you’ll get to where you want to go.

To be a great DBA, treat your career like an investment

One skill I see commonly among great DBAs is that they are constantly learning. They read blogs, follow sites like SQL Server Central, Simple-Talk, or SQL Shack, and spend a ton of time perusing Books On-Line. They watch videos like those put out by Brent Ozar. They spend time playing around in their lab (because naturally, they have one), breaking things, fixing them, and breaking them again. They invest in their best asset: their skills and knowledge.

There is an amazing amount of good, free content out there if you go looking. And in many cases, learning by doing is one of the best ways to gain knowledge. But here’s the thing: if you refuse to spend actual money on your skills, you’re making a mistake.

Early on in my career I was adamant about not paying for anything. Why should I pay for what I can find on my own? And it’s true, there really isn’t much that you can’t find or figure out with enough time and effort.

Here’s the thing though: your time as a DBA isn’t free. Every minute you spend looking through Books On-Line (which, while very complete, is sometimes a real bear to navigate) or struggling through a lab setup is a minute you could spend improving your processes, or doing actual revenue-generating work for your employer (this is doubly true if you’re self-employed).

Premium content is often highly focused and organized, with specific goals. It eliminates a lot of the time spent thrashing around trying to figure things out, and thus allows us to learn more in less time. This is where premium content can really pay dividends: by paying to acquire the same knowledge in less time, we are investing in our careers.

Think about it. If you have electrical work that needs doing around the house, you could get a book from the library, watch some YouTube videos, and probably muddle your way through it. Maybe in time you could get really good at it. But, that takes time, which means ultimately, money. At some point, the cost of the time spent to learn it all yourself exceeds the cost to pay an electrician.

It’s the same thing with paying for premium training. Especially when it comes to highly complex subjects! If your goal is to get a deep knowledge of a particular area, and your time is limited (if yours isn’t, please tell me your secret), you owe it to yourself to at least look around and see what’s out there in terms of paid content.

Case in point: recently, I had the privilege of being an early participant in a phenomenal new course on Windows Failover Clustering by Microsoft MVP Edwin Sarmiento. Clustering is a great example of where this kind of training can be a godsend; it’s complex, with many moving parts (Active Directory, DNS, and networking, to name just a few), and while it’s gotten much better in recent versions of Windows, it’s still a bear if you don’t do everything right the first time. And let me tell you, Edwin’s course leaves nothing untouched. He starts off with the basics of what clustering is (and conversely, what it isn’t), then progressively goes deeper into specific components. The module on quorum is pure gold; I’ve been bitten numerous times by mistakes that Edwin calls out, such as neglecting to configure a witness. He uses clear, simple examples, along with plain real-world analogies, to explain concepts like resource groups and status. I consider myself pretty well versed in clustering internals, but I have already learned more than enough to justify my investment in the course.

Lucky for you, Edwin just opened this course* to the public at a reduced rate, along with a slew of bonuses, such as access to an upcoming course on hybrid cloud and local high availability solutions (which I’m sure will also be phenomenal). The discounts are good for another two days, so do yourself a favor and get it while it’s still available. Believe me, it’s an investment well worth making.

*Full disclosure: if you purchase the course through this link, I earn a small commission. I did not receive compensation for this review however, and paid for access to the course just like you. Thanks in advance for supporting my kids’ future college expenses! 🙂

How Azure kicked my you-know-what

This past week was a whirlwind of activity, as we scrambled at work to spin up a new virtual environment in response to a last-minute urgent need. Now that the dust has settled, I wanted to look back and reflect on things that I could have done better, as I often do after things go off the rails a bit. And they most certainly did go off the rails this week. I’ve said it before: Azure (and the cloud in general) is a game changer and, I believe, a net positive force on our industry, but it also requires huge shifts in how we go about our work, and it sweeps away swaths of knowledge that are no longer valid.

Take disk layout, for example. Traditionally, I’ve always liked to have separate disks for system, user, and temporary databases, with an additional one for backups. In addition, I generally size very conservatively for the sake of keeping costs low. But in Azure, virtual machines are limited in how many disks you can attach to them based on their size, so we have to be very careful about leaving room for expansion. This already bit me once when I tried to add more disks (after going through all the work of installation and configuration, of course), and boy did it stink to have to go back and redo everything.

As technology professionals, we’re going to be pushed at times (who am I kidding, more like often) to move quickly on new technology, even when we are far from comfortable or familiar with it. And if we’re good at our jobs, this is going to set off alarm bells in our heads as we think of all the terrible things that can go wrong, or that we don’t even know could go wrong. Good DBAs have a bit of a paranoid streak in them, because they tend to be very risk averse. But at the same time, we have to find a way to move with the times, lest we risk being seen merely as naysayers and obstructionists, and be left behind.

When these pushes happen, there are a few strategies we can employ to ease our struggles and keep our risks to a minimum.

First, prove out everything you do in a test environment as much as possible. I have a lab in Azure that I utilize on an almost daily basis to try things out and understand the various pieces before attempting to implement them for real. Especially when we’re dealing with new and unfamiliar technologies, experimentation and testing are key to ensuring success. And with anyone able to bring up an Azure subscription and simply pay for the resources they use, the excuses for not having a lab ready to power up are pretty much nil.

Second, take meticulous notes on every action taken. After this week, I have pages upon pages of handwritten notes on things that didn’t work, things that did, and all the steps in between. After the dust (hopefully) settles later this week, I intend to go back and organize these notes into more formal documentation that can be used again when the next need arises. It will also help with creating more automated processes, so that the next time isn’t so labor intensive.

Pro tip: if you use PowerShell at all in your work (and if you don’t, why not?!), the Start-Transcript cmdlet is a lifesaver. It records every command you run, as well as its output. I used it this week and was able to turn the transcript into a working script that carries out all the actions I performed manually this time around. As I’ve said before: do it once, then automate it.
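
For anyone who hasn’t used it, the pattern is as simple as wrapping your working session in a transcript (the file path here is just an illustration):

    # Log every command and its output from this session to a timestamped file
    Start-Transcript -Path "C:\Temp\AzureBuild_$(Get-Date -Format 'yyyyMMdd_HHmmss').txt"

    # ... do the actual work: create resource groups, VMs, disks, and so on ...

    # Stop logging; the transcript file now doubles as raw documentation
    Stop-Transcript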

Finally, document any and all risks that you discover in writing, and make sure your boss is aware of them. You might think that you’re being a negative ninny, and perhaps you are, but it’s your job to make sure all the players are aware of potential issues. Note that this doesn’t mean you should refuse to move forward in the face of these! Throughout my career I have pushed through all sorts of uncomfortable situations, simply because those above me decided that, after all consideration, moving forward was still the best option. I have disagreed with them, vehemently at times, but once the orders are given, it’s my job to make the best of them. All we can do is make sure that our concerns are heard (and more importantly, understood).

When we raise these concerns, it’s very helpful to include a list of possible mitigation strategies, along with how each would impact the project at stake. For example, let’s say that you only have time to build a single stand-alone server, rather than a cluster or mirrored pair, and you’re worried about incurring possible downtime. When you document this concern, note how long it would delay the project if you were allowed to bring up and configure a second server. Perhaps you still won’t be allowed to delay the critical path of the project, but management may agree that, in parallel, your first priority is to get redundancy in place as soon as possible.

Let’s face it: change and challenge are inevitable in this business. When they come, we can complain and obstruct, or we can move forward with grace, make the best of the situation, and learn from our mistakes. Over time, we’ll get better, and we’ll be seen as an integral part of the continued success of the companies and clients we serve.

The surprising skills that could make or break your career

Perhaps it’s the teacher’s blood in me (both my parents were lifelong teachers), but I’ve always loved helping others learn and improve themselves. It’s one of the reasons I enjoy writing articles, giving lectures at my company, and working one on one with individuals who are trying to learn how to better use SQL Server. There are few things more satisfying than seeing someone work through a tough problem, or show improvement in their skills, especially when I see that person advance in their career because of it.

For this reason, I’ve always made a point to think about the skills, habits, and traits that enable us to be successful in our careers. The more I read, research, and examine things, the more I come to a surprising conclusion: while technical knowledge and aptitude are absolutely a factor in your success, the most critical ingredients include a number of skills that have nothing to do with SQL Server, or any technology for that matter. Instead, these are what I’d term meta-skills. Here are some examples.

Knowing How to Learn

Continual learning is something that any competent IT professional must accomplish, lest we be left behind in a rapidly changing technological world. Unfortunately, acquiring and incorporating new knowledge and skills is not a simple task. Many folks go through school learning the traditional methods of brute-force memorization and repetition, which, as it turns out, are largely ineffective at producing long-term retention and understanding. Instead, as authors McDaniel, Roediger, and Brown argue in their work Make It Stick, practice and unaided recall are two of the best ways of ensuring information is encoded into long-term memory. In addition, we might learn something from the methods of Benjamin Franklin, who made a habit of laboriously restating the words of others in his own voice.

Writing Effectively

Oftentimes as database professionals we are asked to document things in written form. For example, we might need to explain in some detail (yet still in a way that is easily understandable) why a problem is occurring and what must be done about it. Or we might need to make the case for spending some hard-earned capital on new hardware or a piece of software that will make our jobs easier. For these and many other scenarios, being able to clearly and effectively communicate one’s ideas is crucial. And yet, I have heard many sneer at the idea that one should spend effort learning the basics of writing, improving one’s grammar, or understanding how to construct a well-thought-out argument. If you want to get your point across effectively, and have others understand, you need to at least spend time developing your writing abilities (blogging might be an excellent start!).

Time Management

As a DBA or database developer, you are going to have a large number of things demanding your time and attention at any given point. Since science has pretty definitively shown that multi-tasking hurts productivity, you are also going to have to learn how to manage all the various things on your plate. Here’s a hint: no matter how well you think you can juggle all of that in your head, you’re likely to be wrong. Our brains are hard-wired to keep unfinished tasks in memory, to the detriment of our concentration. While there are any number of valid approaches to managing the tasks and projects in your life, what’s important is that you experiment to find one that works well for you and stick with it. Whether it’s a simple paper-based system like Mark Forster’s Autofocus or a highly complex, multi-part one like J.D. Meier’s Agile Results, the important thing is that you don’t rely on your brain to keep track of all these things.

I could go on, but I think those are probably the three most important ones that come to mind. As important as these things are, I find that the number of people writing about and teaching these subjects is far smaller than the number focusing on purely technical topics, especially from within the IT profession. As a result, I believe it’s an area that is ripe for an influx of guidance and simple, step-by-step processes for improvement. I’m not sure how yet, but I’ve resolved to make this a large part of what I contribute to the community as a whole. In the meantime, I’d love to hear your thoughts on this as well.

My father made me a better DBA by teaching me these two things

Today is Father’s Day in the United States, so naturally I spent most of the day with mine, hence no long post today. But still, I thought it appropriate to talk a little bit about how he’s influenced my way of thinking in ways that have helped me in my path as a database professional.

He’s a retired physics teacher who taught for over four decades, spanning both high school and college. Well, semi-retired at least; he keeps taking the odd teaching job or holding seminars teaching other teachers. I asked him today, “Do you think you’ll ever really stop teaching?”

He laughed, thought about it for a moment, then decided the answer was no. “I just like teaching people too much,” he said. “I’d do it for free.”

He’s a phenomenal teacher, solid in every way, from lecturing to handling rebellious teenagers in his class (I would know, I was one). It’s amazing, when I talk to his former students, how much respect they have for him. I was once talking to a friend of mine who had been in one of his classes, and I asked her if she enjoyed it. With a sheepish look on her face, she admitted that she hadn’t given it her full attention, but even so she looked back fondly on the experience and remembered how his passion for teaching shone through. She implored me to apologize on her behalf the next time I saw him, for “being that slacker kid”.

When I mentioned this to my dad, he shrugged. “I don’t remember her being a slacker,” he said. “Besides, if she was bored, it was because I didn’t do a good job making the material interesting.”

His real strength and passion has always been laboratory work. Growing up, I used to love going with him to his classroom, where we’d set up elaborate stations of equipment, with rails and boards and all sorts of measurement devices. Each one illustrated one or more concepts he taught in a clear and understandable way, while adding an element of fun to the process. My favorite was probably the compound bow he would set up such that it shot an arrow into a canister filled with clay. By measuring how far the can swung, we could begin to calculate the force produced by the bow.

Looking back on the whole experience, I can clearly see two ways in which he has had a substantial effect on my approach to my career and work in general.

First, he kindled in me a desire for constant experimentation and observation. He’s constantly setting up scenarios and experiments, asking his students to predict what the outcome will be, then forcing them to gather data to support their hypotheses. Even today over lunch, at one point he drew out an electrical circuit diagram and asked me how I thought it would behave (thankfully I was correct!). That constant mental exercise of observe, predict, gather, and analyze is one I have used many times in troubleshooting problems with SQL Server (I even wrote an article on the subject). Too many times I’ve seen people in technology approach their work in a haphazard and scattered way; everyone would do well to remember the lessons taught by their high school science teachers about the careful and methodical approach of science.

Second, his passion for teaching others infected me with the same love of helping people learn. While I’m not a teacher by profession, the role of a database administrator (especially a senior one) is simply loaded with opportunities to teach others. Whether it’s helping a developer who has never learned how to think in sets escape the object-oriented paradigm, or designing a series of standard diagnostic steps for front-line customer support to follow when dealing with SQL Server related problems, I get tremendous satisfaction out of transferring my knowledge to others. To see them grapple with the information, twist it around in their heads, and suddenly gain an understanding of a new concept, experiencing that fabled ‘aha’ moment… well, let’s just say it’s easily one of the best parts of my job.

Even though he’s not the most technically savvy person (a fact that we both laugh about), and these two lessons are not strictly technical, I have no doubt whatsoever that they’re a part of why I have been so successful in my work. So today, I am thankful for the influence he has had on my life; I can only hope that one day I am half the teacher he is.

Today’s question: who in your life, outside of work, has influenced you in ways that have helped you in your career? I’d love to hear your thoughts in the comments.

One skill all good DBAs must have

My father-in-law is a skilled woodworker and artist who has been creating beautiful kinetic sculptures for over 30 years. If you stroll around his workshop, you’ll see many examples of odd-looking contraptions built out of scrap lumber. Ask him about these, and he’ll usually readily recall a time when he needed to align two pieces of wood in a certain way, or drill two holes just so far apart. Because he builds sets of nearly identical pieces, he quickly realized that by creating and utilizing these contraptions, he could produce work at a far greater speed and with more accuracy.

In woodworking terms, these are known as jigs. They are basically tools that woodworkers use, often constructed by hand, that allow them to more easily perform routine tasks. For example, if they need to cut many pieces of wood in the exact same way, they might make one ‘master’ cut, then base all other cuts off that one. That way, they don’t have to measure each and every time (and possibly introduce error in the process). While these tasks could be done with existing tools (measuring tape and pencil), the simple creation of a jig results in a much higher rate of production along with more accurate work.

As DBAs (and really people who work with databases in general; this applies to you devs too!), we are inevitably going to encounter painful processes in our work. There are going to be tasks we must complete where our existing tools aren’t up to the job, and as a result we’re going to spend time battling through cumbersome work. Here’s an example from recent memory.

As I’ve written before, I’ve been working a lot more with Azure-based solutions as of late, specifically Azure SQL Databases. There are a lot of differences between them and traditional SQL Server, not the least of which is that it’s darn near impossible to easily execute a query against all databases on a particular Azure SQL server. Sure, there are elastic database queries, but those require a lot of premeditation and setup.

Can’t a guy just get some good old fashioned sp_msforeachdb style help?

So I opened my trusty PowerShell editor and cranked out a function that replicates the majority of the functionality in that venerable undocumented procedure. Now I can pass in a server name, credential information, and a query, and get back a nice result set that I can then pipe to something else or simply save as needed.
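
This isn’t the exact function I wrote (see the P.S. below if you’d like that one), but a rough sketch of the general shape might look something like the following, with the SqlServer module’s Invoke-Sqlcmd doing the heavy lifting; the function and parameter names are purely illustrative.

    function Invoke-SqlForEachAzureDb {
        param (
            [Parameter(Mandatory)] [string] $ServerName,   # e.g. myserver.database.windows.net
            [Parameter(Mandatory)] [string] $UserName,
            [Parameter(Mandatory)] [string] $Password,
            [Parameter(Mandatory)] [string] $Query
        )

        # Ask the logical server's master database for the list of user databases
        $databases = Invoke-Sqlcmd -ServerInstance $ServerName -Database master `
            -Username $UserName -Password $Password `
            -Query "SELECT name FROM sys.databases WHERE name <> 'master';"

        # Run the query against each database, tagging every row with where it came from
        foreach ($db in $databases) {
            Invoke-Sqlcmd -ServerInstance $ServerName -Database $db.name `
                -Username $UserName -Password $Password -Query $Query |
                Select-Object @{ Name = 'DatabaseName'; Expression = { $db.name } }, *
        }
    }

From there, it’s just a matter of passing in the server name, a login, and whatever query you want run everywhere, then piping the results on to Export-Csv, Out-GridView, or wherever else they need to go.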

Is it perfect? Far from it! It would be fantastic to have this inside of Management Studio where I already spend most of my days. Still, it’s a huge help and a step in the right direction. And if I find I use it a lot, then I can always devote more time towards improving it.

The bigger lesson from this experience is this: the ability and motivation to create tools for ourselves is a critical skill if you are going to advance in your career as a database professional. Whether it’s PowerShell, straight Transact-SQL, or even something more complex, we must put ourselves in the mindset of a craftsman. When we encounter pain, we need to create things that lessen it.

There will always be things we must do repeatedly, and sometimes those things won’t be easy to do with the tools we have at our disposal. Develop the ability to create your own tools, and you’ll be well ahead of most of your peers in no time.

P.S. If you think the function that I wrote would be helpful to you, I’m giving it away! Enter your name and e-mail below and I’ll send it along.

Why Azure SQL Database is a Great Thing For Small Businesses

At work as of late I’ve been learning quite a bit about the Azure SQL Database offering from Microsoft. It’s been both a fascinating and frustrating experience. Azure SQL Database has many fundamental differences from the traditional model of SQL Server, many of which require a lot of re-thinking about how we manage our databases. For example, did you know that you cannot write queries that join tables from different databases? Or that you cannot change database contexts once you’re connected (no more “USE Database”)? And those are just two of many. But that’s for another day; today I want to explain why I think the Azure SQL offering is a fantastic thing in terms of getting small businesses to adopt SQL Server as a technology.
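
As a quick illustration of those two restrictions (the database and table names here are invented for the example), code like this runs fine against a traditional SQL Server instance but fails against Azure SQL Database:

    -- Switching database context after connecting: not supported in Azure SQL Database
    USE SalesDb;

    -- A three-part-name, cross-database join: also not supported, since each
    -- Azure SQL Database is an isolated resource rather than one of many
    -- databases sharing a single instance
    SELECT  o.SalesOrderID, c.CustomerName
    FROM    SalesDb.dbo.Orders AS o
    JOIN    CrmDb.dbo.Customers AS c
            ON c.CustomerID = o.CustomerID;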

Let’s say that you are an IT consultant specializing in helping small businesses (especially really small businesses, like Mom and Pop shops). One of your customers wants to start using an accounting package that requires SQL Server, so they ask you to help them understand how much it will cost them. Now this is a really small firm with extremely limited budgets, so we’re going to have to compromise on some things (sorry, no Fusion-IO storage). Even so, when we look at the costs of a bare-bones server class machine, along with the costs to acquire the software licensing required, it’s still pretty steep.

  • Hardware: $1,819 (and I really did cut corners; this has 7200 RPM SATA disks and only 8 GB of RAM)
  • SQL Server 2016 Standard Edition CAL based license: $879
  • 5 SQL Server 2016 Device CALs: $1,049
  • Total: $3,749.40

For many small family run businesses, this amount is not trivial. Plus, you must consider factors such as:

  • Setup of backups and maintenance
  • Handling issues such as hardware or operating system failures
  • Disaster recovery

Between the initial up-front cost and ongoing maintenance, this represents a significant investment both on your part and the part of your customer.

Now, let’s consider the same scenario, but use Azure SQL Database as the platform to meet our SQL Server needs. First, let’s assume that the volume is small enough that we need around 20 DTUs of capacity on average (and even at peak times, say, running end-of-month processing, we might need around 50). Given this, the S1 service tier meets their requirements. This costs around $30 per month (as of publish date, see this chart), with no up-front capital required. Let’s say that for around 24 hours of the month (three working days), we need to bump the database to the S2 service level to accommodate extra month-end processing. Adding that in adds a paltry $2.42 worth of extra cost, bringing our total to around $32 a month. At that rate, it would take approximately 117 months (roughly $3,749 divided by $32 a month) to equal the initial up-front cost of the traditional solution. That’s close to ten years, folks. Talk about a way to spread out your capital cost!

Even better, much of the overhead for your customer is no longer an issue. For example, there is no need to configure backups, as they are automatically put in place. Azure SQL databases are automatically locally redundant, with a 99.99% up-time guarantee. If your customer wishes to have a disaster recovery plan in place, then you could utilize the active geo-replication feature, and your customer only pays for one additional database (bringing the cost to a whopping $64 per month).

This is truly nothing short of a game changer. I mean, can you imagine being able to have a conversation with a tiny family owned business that includes disaster recovery?

Now, it’s true that Azure SQL Database has its limitations and drawbacks. As we mentioned earlier, some notable features are not available, which may limit what applications can use the offering (no SQLCLR for example). In addition, it does introduce a critical point of failure, namely the customer’s internet connection. Still, with the tremendous cost savings up front, something like a backup internet provider connection suddenly becomes a realistic possibility.

Time will tell if the Azure SQL Database offering becomes as successful as I believe it will. But one thing is certain: Azure gives us far more cost effective and lower risk options to bring customers on board with utilizing SQL Server.

Principles of Good SQL: Aggregate Early

When writing T-SQL queries, it’s important to reduce the volume of data as quickly as possible. Larger volumes of data mean more input/output operations (think more reads from and writes to disk), more CPU cycles (hash joins with large sets of rows versus smaller ones), and more overhead for acquiring and releasing locks on resources in SQL Server. By shrinking large sets of data down early in the query, we can greatly increase the efficiency of our operations.

Here’s a simple example. Using the standard AdventureWorks2012 database and Jonathan Kehayias’s excellent scripts to enlarge it, we can show how pre-aggregating data can reduce the amount of CPU resources required for a query.
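
Here’s a representative version of the kind of query we’re talking about, grouping total sales by customer name and product name (the exact columns aren’t important, just the overall shape; the enlarged copies of the sales tables can be substituted in as appropriate):

    SELECT  p.FirstName,
            p.LastName,
            pr.Name AS ProductName,
            SUM(sod.LineTotal) AS TotalSales
    FROM    Sales.SalesOrderHeader AS soh
    JOIN    Sales.SalesOrderDetail AS sod
            ON sod.SalesOrderID = soh.SalesOrderID
    JOIN    Sales.Customer AS c
            ON c.CustomerID = soh.CustomerID
    JOIN    Person.Person AS p
            ON p.BusinessEntityID = c.PersonID
    JOIN    Production.Product AS pr
            ON pr.ProductID = sod.ProductID
    GROUP BY p.FirstName, p.LastName, pr.Name;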

Running this through the excellent (and free) tool Plan Explorer from SQL Sentry, we see that it consumed around 8.1 seconds of total CPU time. Examining the plan a little more closely, we can see that a large number of rows flowed through until the very last step, where the Stream Aggregate operation occurred.

[Screenshot: the query’s execution plan in SQL Sentry Plan Explorer]

Some of those operations were hash joins, which require CPU resources to compute the various “buckets” that rows are placed into based on the hash key (perhaps I need to write something explaining this in more detail, but that’s for another day). The more rows that pass through them, the more CPU the query execution will require. Thus, if we can reduce the number of rows that pass through these hash join operations, we can likely reduce the amount of CPU the query needs.

Looking at the query, we can see that it is grouping at the level of the customer first and last name, as well as the product name. Following the joins, we see that these attributes are linked to the columns SalesOrderHeader.CustomerID and SalesOrderDetail.ProductID. So, what would happen if we grouped as soon as we had those attributes available, rather than after the integer values have been expanded into their full names? We can do this by using a derived table in the query, as follows.
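
Again as a sketch rather than the exact original code, the rewrite pushes the GROUP BY down into a derived table that works purely on the narrow integer keys, and only then joins out to pick up the names:

    SELECT  p.FirstName,
            p.LastName,
            pr.Name AS ProductName,
            SUM(agg.TotalSales) AS TotalSales
    FROM
    (
        -- Aggregate early, while we still only have the integer keys
        SELECT  soh.CustomerID,
                sod.ProductID,
                SUM(sod.LineTotal) AS TotalSales
        FROM    Sales.SalesOrderHeader AS soh
        JOIN    Sales.SalesOrderDetail AS sod
                ON sod.SalesOrderID = soh.SalesOrderID
        GROUP BY soh.CustomerID, sod.ProductID
    ) AS agg
    JOIN    Sales.Customer AS c
            ON c.CustomerID = agg.CustomerID
    JOIN    Person.Person AS p
            ON p.BusinessEntityID = c.PersonID
    JOIN    Production.Product AS pr
            ON pr.ProductID = agg.ProductID
    GROUP BY p.FirstName, p.LastName, pr.Name;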

Running this query through Plan Explorer, we find that this indeed had the desired effect: the number of CPU seconds consumed is reduced down to just over 3 seconds (a roughly 62% decrease). In addition, when examining the plan we can see that the optimizer moved the aggregate operation earlier in the plan, thus reducing the number of rows that flowed through the hash join operations.

[Screenshot: the rewritten query’s execution plan in SQL Sentry Plan Explorer]

There are many other ways this approach might help, such as limiting the amount of query memory required for sorts. The derived table approach also may not always work; SQL is a largely declarative language, in that you tell it what data you want and it decides how to retrieve it. We can give it hints, either implicitly (as we are doing here) or explicitly using table hints (something that should rarely be necessary), but ultimately it controls how it answers our request. Thus, it’s critical that we check the execution plans of our queries, and that we run them against representative sizes and compositions of data. This is the only way we can give ourselves a chance at producing robust, well-performing SQL code.
