Due Care vs Due Diligence

After a few discussions this week, an information security governance topic that comes up often and creates confusion (especially for those learning about governance) is due care vs. due diligence.

TL;DR: Due care is the thought you put into securing your environment by creating policies and procedures to protect it. Due diligence is the effort you put into making sure those policies and procedures are enforced and followed.

To be clear, these are not terms limited to information security. Anyone who has had to sit through countless compliance meetings when dealing with government regulation bodies like the Securities and Exchange Commission or health care governing bodies knows that these terms get floated around a lot in the compliance field. We just happen to be focusing on their relationship to security.

It can be difficult to grasp these terms because people sometimes explain them in confusing ways. You’ll hear that one is thought and one is action, one is feeling and the other is effort, etc. But it can still come out fuzzy in the wash, so let’s actually look at the terms themselves. To do that, we must look at what they both relate to in the first place: protection.

The ultimate mission is protection. Depending on your industry, what you’re protecting could be something different but ultimately, you’re trying to make sure that the thing you’re protecting is treated properly. So let’s look at the terms now…

They both begin with “due”. You may have heard a phrase in reference to ‘giving something its due’. Basically, that means you’re affording that item what it deserves. In this instance, if our goal is the protection of people, systems and data, we need to make sure that in both cases we’re giving them the amount of protection they deserve. So now we look at the unique terms: “Care” vs “Diligence”.

If you care about something, it matters to you. That means that you put thought into how you treat it and ultimately how you “care” for it. If you care about your child, you establish rules and guidelines for that child. You set boundaries for them because you want to protect them (even from themselves). If you care about the clients, personnel, data and systems you’re endeavoring to protect, then you take time to think about and create policies to protect them from harm, abuse, unauthorized access, accidental damage and destruction, etc. That’s the basis of “due care”: from a management/oversight level, taking the time to think about and create meaningful and useful policy for the protection of your environment.

Here’s the thing, though: care doesn’t do you much good if you don’t have follow-through. If you set boundaries, rules and guidelines for your child and never enforce them, what good have you done? That’s the purpose of diligence. Due diligence is the execution of due care. It’s the diligent effort placed into making sure that policies and procedures are utilized. When you exercise due care by enabling logging on a secure system, what good is it if you’re not diligent about reviewing those logs? Diligence is required to make the initial care you put in valuable.

The interesting thing about these terms to me is how distinct yet dependent they are. Due care is useless without the effort to make it worthwhile through diligence. But due diligence means nothing if you’ve not taken the time to establish the appropriate policy to protect what’s important to you. You can diligently push a rock up a hill all day but what good is that? These two concepts go hand-in-hand in the establishment and continued success of a strong security framework.

Terminology Overview for Documenting Security Governance

It’s very possible that title is the most boring 6 words I’ve ever composed and I used to have to write lines on the chalkboard in school. A lot.

For the purpose of Information Security governance and studying for various management-level Infosec exams, though, this terminology is very important. Understanding not only the terms themselves but, more importantly, how they relate to one another helps guide the entire governance and policy process.

If you’re anything like me, you struggle with documentation. Policy is not something I tend to enjoy, especially when I have the workload of 5 people. Actual physical tasks tend to win out over documentation, organization and proper structure. That’s not something to strive for. We all need to make time for documentation.

First and foremost, understanding one’s role in an organization is key. Information Technology, Information Security, Help Desk, QA, Development, etc. all support the business. In most cases they are not the business itself. As a result, support departments need to align their planning with the overall mission of the organization. Doing so makes buy-in from upper management much easier to acquire. So the top of the governance food chain belongs to:

Mission Statement: This is a simple and straightforward expression of what the company endeavors to do, for whom and the values it applies to that end. It’s a concise overview of how the organization sees itself. (A personal mission statement might be: I wish to be healthy so that I can live a long and happy life with my wife.)

How does the organization plan to achieve that mission? That’s what the strategy is for. The strategy is there to explain how the organization plans to accomplish its mission. If the mission statement explains the What?, Why? and Who?, the strategy covers the How? A mission statement generally remains the same; a strategy changes often in an effort to adapt to fluid business environments and landscapes. (A personal strategy related to the above mission statement could be: I am going to eat in a healthy manner and exercise to lose weight.)

A strategy is a high-level plan for execution. It will cover the plan of attack, so to speak, without getting into the actual details. To describe the actual details of execution, an organization establishes goals. This is a term we’re all fairly familiar with but for these purposes it may be a little more narrowly defined than we’re used to. A goal, in this terminology, is a relatively intermediate-term desired outcome. It’s unwise to establish goals that are too broad or long-term as they tend to be less attainable. (Personal goals might include: a) Lose 10 lbs. and b) Incorporate more vegetables into my diet.)

The final rung on this ladder is the objective. The objective allows us to define shorter-term, more finite tasks for the purpose of reaching the goal. We often have numerous objectives with quick deliverables so that we can continue seeing progress on our way to achieving a goal. (Personal objectives might include: a) Go to the gym on Monday, Wednesday and Friday for at least 30 minutes, b) make a meal plan for this week and c) purchase the necessary vegetables at the grocery store.)

Now you may often find that goals and objectives are used interchangeably. I had to actually strive not to use them to describe each other as I wrote this! But for this purpose, they are distinct structures for the purpose of proper organization with defined time periods.

Mission Statement and Strategy (LONG TERM) are carried out by setting specific Goals (INTERMEDIATE TERM) which are executed by setting and achieving Objectives (SHORT TERM).

Hopefully that made sense 🙂 In the following article, we’ll talk about how the strategy is not only accomplished through the execution of goals but also the establishment of policies, standards and procedures.

SQL The Memory Hog

I’ve worked with Microsoft SQL Server for a long time. Compared to people who study SQL to a depth I won’t ever find possible (or palatable), I would not call myself an expert. But I’ve had my fair share of issues to resolve, and this one comes up frequently since it’s how SQL Server installs by default.

You just get your new SQL Server up and running and within a short while of using it, your machine is running unfathomably slow. You manage to get Task Manager open and find that sqlservr.exe is pretty much chewing up all the RAM. “That makes no sense,” you lament. “I don’t even have that much running on the SQL Server yet.”

The reason: By default, SQL Server sets the “Maximum server memory” option to the largest value a 32-bit signed integer can hold: 2,147,483,647 MB (roughly 2 petabytes). Unless your server happens to have 2 petabytes of RAM, SQL Server just creeps northward, holding all your RAM hostage, whether you’re using it or not.

Fortunately, this is an easy problem to fix.

Resolution:

  1. Log in to SQL Server Management Studio and connect to your server (with sufficient privileges, obviously).
  2. Right click on the server name itself.
  3. Select Properties.
  4. In the Server Properties dialog, click Memory.
  5. Change the “Maximum server memory (in MB)” value to an appropriate portion of the RAM on your server.

Before changes:

[Screenshot: Memory page showing the default maximum server memory value]

After changes (in a server with 32GB of RAM):

[Screenshot: Memory page after lowering the maximum server memory value]

Now bear in mind, setting this value too low will naturally cause performance issues for your server. You want to set it lower than the full value of the RAM on the server, but not so low that you’re severely restricting SQL Server’s ability to perform.

This is a first-line solution. There may be issues with how your SQL code is written, OS-level performance misconfiguration or more advanced SQL Server configuration issues with paging and so on. But 99 times out of 100, on an initial, default installation of SQL Server, this solves the problem. You may need to test max memory values for what works on your particular setup in order to find the sweet spot. Enjoy.
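If you prefer to script the change rather than click through Management Studio, the same setting can be adjusted with sp_configure. The 8192 MB value below is only an example; pick a cap that suits your server’s RAM:

```sql
-- Allow advanced options to be changed
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Example value only: cap SQL Server at 8 GB
EXEC sp_configure 'max server memory (MB)', 8192;
RECONFIGURE;
```

This takes effect immediately and doesn’t require a service restart.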

T-SQL: Stored Procedure vs Query Performance

The query ran fast. The stored procedure of that same query is slow.

Let’s do this. For those looking for a solution and no explanation, we’ll start with the answer and finish with an explanation.

Solution:

You started with a query that looked like this:

DECLARE @PersonID int

SELECT * FROM Persons WHERE PersonID = @PersonID

It worked great. Fast as can be. Then you created the stored procedure:

CREATE PROCEDURE [dbo].[dbsp_Get_Person_From_PersonID]
@PersonID int
AS

BEGIN
SELECT * FROM Persons WHERE PersonID = @PersonID
END

And now it’s slow. You can fix this by adding an internal variable and assigning the new internal variable to the value of the external variable and using the new internal variable in your query:

CREATE PROCEDURE [dbo].[dbsp_Get_Person_From_PersonID]
 @PersonID int
AS

BEGIN
 DECLARE @PersonIDInternal int

 SET @PersonIDInternal = @PersonID

 SELECT * FROM Persons WHERE PersonID = @PersonIDInternal
END

That’s it. Hopefully that solves your problem. If you’re curious why such an irritatingly redundant step could be necessary, feel free to read on.

Explanation:

Before I figured out the solution to this issue, I think I lost a few hairs to the gray side. I’ll describe the scenario first:

You’ve worked very hard to build a SQL query (maybe you didn’t work that hard but we’ll pretend you did anyway). Being the good developer you are and keeping your anti-SQLi (SQL Injection) wits about you and desiring that sweet SQL optimization, you convert your reusable code into a stored procedure. There’s only one problem:

The query ran fast. The stored procedure is slow.

Now the degree to which this takes place depends on the query. I’ve had queries that ran instantaneously as ad hoc queries but turned into 30-second queries as stored procedures. You’ll pore over that code, trying to find what you could have done wrong. So what changed? A little feature of SQL Server optimization called parameter sniffing.

SQL knows you want to reuse that stored procedure and for queries more complex than the ones we demonstrated above, SQL doesn’t want to figure out how to execute that code every single time it runs it. So SQL Server creates an “execution plan” on exactly how to get your data and then caches it, theoretically to make it execute faster.

However, during that process of creating an execution plan, SQL automatically assumes that every time you run this query, it’s going to run the same way. It assumes that because you’re using parameters. You see, depending on what parameters you pass in to your query, you could get very different results, especially if you have a complex query designed to return a variable number of records. Imagine if, in the query listed above, it sometimes returned 1 record and other times returned 10,000 records. If you were just running that in a query window, SQL would probably build different execution plans for optimization. But because it’s in a stored procedure and SQL has cached the execution plan, it runs the same way, every time.

That’s not a bad thing unless that’s precisely what you DON’T want to happen.

By providing an internal variable as in the example above, you prevent SQL Server from sniffing the parameter value; it builds a generic plan from the table’s average statistics rather than one tailored to a single (possibly atypical) value.

Another way of doing this is using the “WITH RECOMPILE” option.

CREATE PROCEDURE [dbo].[dbsp_Get_Person_From_PersonID]
 @PersonID int
WITH RECOMPILE
AS

BEGIN
 SELECT * FROM Persons WHERE PersonID = @PersonID
END

This forces SQL Server to build a fresh execution plan every time the stored procedure is called.

Here’s Microsoft’s blurb on it:

Another reason to force a procedure to recompile is to counteract the “parameter sniffing” behavior of procedure compilation. When SQL Server executes procedures, any parameter values that are used by the procedure when it compiles are included as part of generating the query plan. If these values represent the typical ones with which the procedure is subsequently called, then the procedure benefits from the query plan every time that it compiles and executes. If parameter values on the procedure are frequently atypical, forcing a recompile of the procedure and a new plan based on different parameter values can improve performance. – (https://msdn.microsoft.com/en-us/library/ms190439.aspx)
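A more targeted variant, not shown above but standard T-SQL, is the statement-level OPTION (RECOMPILE) hint, which recompiles only the statement that suffers from sniffing instead of the whole procedure:

```sql
CREATE PROCEDURE [dbo].[dbsp_Get_Person_From_PersonID]
    @PersonID int
AS
BEGIN
    -- Only this statement gets a fresh plan on each call,
    -- built for the actual parameter value passed in.
    SELECT * FROM Persons
    WHERE PersonID = @PersonID
    OPTION (RECOMPILE);
END
```

For procedures with many statements, this avoids paying the recompile cost on the statements that were never the problem.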

 

SQL Server – Last Update and Access Times

I periodically review the usage statistics on my databases to determine what is outdated and no longer used. Here’s the easiest way I’ve found to do it:

You’re looking for the sys.dm_db_index_usage_stats dynamic management view. It contains a wealth of information and I recommend having a look at it when you get a chance. But for now, we just want to know when a table was last accessed or updated.

(CAVEAT: The counters/datetimes are reset when the SQL Server service is restarted so the information you get is since the last restart, not the last access/update time historically. This means that if you restarted the SQL Server service 10 minutes ago, it will probably look like none of your databases have ever been accessed.)
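Because of that caveat, it’s worth checking how long the counters have actually been accumulating. The service start time is available from the sys.dm_os_sys_info DMV:

```sql
-- When did the SQL Server service last start?
-- Stats in sys.dm_db_index_usage_stats only cover the time since then.
SELECT sqlserver_start_time
FROM sys.dm_os_sys_info;
```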

How to look at last access/updates for a specific table:

USE [MyDatabase]
GO

SELECT
    OBJECT_NAME(object_id) AS Table_Name,
    last_user_update,
    last_user_seek,
    last_user_scan,
    last_user_lookup, *
FROM sys.dm_db_index_usage_stats
WHERE database_id = DB_ID('MyDatabase')
AND OBJECT_NAME(object_id) = 'MySpecificTable'
GO

If you’re looking for the last access/update times for the whole database:

USE [MyDatabase]
GO

SELECT
    OBJECT_NAME(object_id) AS Table_Name,
    last_user_update,
    last_user_seek,
    last_user_scan,
    last_user_lookup, *
FROM sys.dm_db_index_usage_stats
WHERE database_id = DB_ID('MyDatabase')
ORDER BY Table_Name
GO
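As a convenience (my own variation, not part of the queries above), you can collapse the three read columns into a single “last read” value per table, which makes stale tables easier to spot at a glance:

```sql
USE [MyDatabase]
GO

-- The largest of seek/scan/lookup = the last time the table was read at all
SELECT
    OBJECT_NAME(object_id) AS Table_Name,
    last_user_update,
    (SELECT MAX(v) FROM (VALUES
        (last_user_seek),
        (last_user_scan),
        (last_user_lookup)) AS reads(v)) AS last_user_read
FROM sys.dm_db_index_usage_stats
WHERE database_id = DB_ID('MyDatabase')
ORDER BY Table_Name
GO
```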

The system uses all kinds of index operations to keep a count of this information and it’s very handy when you need it. Enjoy.