SQL Server – Database Consolidation

As more and more databases come into an enterprise, the question of database expansion and consolidation is often brought to the table.  Capacity expansion costs money, not just in software licenses, but also in maintenance and hardware.  At some point there has to be a trade-off between the cost of keeping up performance and storage and the performance of other applications.  This is especially true with the proliferation of SharePoint in many organizations, where the first impulse is to treat SharePoint like a file server, which quickly eats through storage space.  However, through database consolidation, companies can reduce the number of database servers they have to manage and reduce the demand for servers and storage. Consolidation can also provide greater operational efficiency. If you can show a better Return on Investment (ROI) for capital expenses like servers, your finance department will love you.

Types of database consolidation

There are three basic types of consolidation: physical server consolidation, SQL Server instance consolidation, and storage consolidation.

Physical server consolidation

This is one of the most popular forms of consolidation, especially as companies find that they have multiple servers that are all underutilized and Microsoft has gone to the per-CPU licensing model.  Physical database server consolidation is the process of consolidating multiple underutilized physical SQL Servers into fewer physical SQL Servers. It is a common problem: many organizations have a number of physical database servers that are underutilized, and time must be spent maintaining and patching them.  This often occurs in project-based environments where a new physical database server was commissioned each time an application requiring a SQL Server database was deployed. As a result, these organizations end up with various versions of SQL Server scattered among Windows servers running multiple operating systems.

This type of consolidation helps organizations reduce the number of physical database servers by identifying multiple underutilized physical database servers and consolidating them into a single physical database server running multiple instances of SQL Server. In this way, organizations can end up with the same number of instances, just placed on fewer physical database servers.

Physical server consolidation is going to save you in terms of hardware, OS licensing, patch maintenance, and third-party software licensing.
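Before you can consolidate, you need an inventory of what's out there. A minimal PowerShell sketch of that discovery step might look like the following; the server names are placeholders, and this assumes you can reach the candidate machines over the network with remote service enumeration enabled:

```powershell
# Hypothetical server names -- replace with your own inventory list
$servers = "SQLPROD01", "SQLPROD02", "SQLPROD03"

foreach ($server in $servers) {
    # MSSQL$<InstanceName> services identify named instances;
    # MSSQLSERVER is the default instance
    Get-Service -ComputerName $server -Name "MSSQL*" -ErrorAction SilentlyContinue |
        Select-Object @{Name = "Server"; Expression = { $server }}, Name, Status
}
```

From there you'd want to layer in perfmon counters (CPU, memory, disk) over a representative period before deciding which servers are truly underutilized.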

SQL Server instance consolidation

Each instance of SQL Server takes up a chunk of memory.  Often the instances are underutilized, each one having been spun up in an attempt to isolate an application and its rights to one particular instance.  This type of consolidation reduces the number of SQL Server instances by consolidating the databases they host onto fewer instances, freeing up memory.  Instance consolidation might involve just one server; for example, you might have seven or eight instances on one server and want to combine those databases into one instance on that server. This type of consolidation saves on CPU and memory usage, as well as patch maintenance, since you have fewer instances to patch during the maintenance window.
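The mechanics of moving a database between instances are usually just backup and restore. Here is a hedged sketch using `Invoke-Sqlcmd` (assumes the SQL Server PowerShell module is loaded); the instance names, database name, and share path are all hypothetical, and the share must be reachable from both instances:

```powershell
# Placeholders: source instance being retired, consolidated target, database, share
$source = "OLDSRV\App1"
$target = "CONSOLIDATED\Main"
$db     = "SalesDB"
$backup = "\\fileshare\migrations\$db.bak"

# Back up on the retiring instance
Invoke-Sqlcmd -ServerInstance $source -Query "BACKUP DATABASE [$db] TO DISK = N'$backup' WITH COMPRESSION;"

# Restore on the consolidated instance
Invoke-Sqlcmd -ServerInstance $target -Query "RESTORE DATABASE [$db] FROM DISK = N'$backup';"
```

If the data and log file paths differ between instances, the `RESTORE` will also need `WITH MOVE` clauses; and don't forget to recreate logins and jobs on the target, since those live outside the database.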

Storage consolidation

Storage consolidation is very common in today’s high-end storage solutions, where a SAN is used for shared storage.  By consolidating SQL storage, companies can invest in high-end solutions that are shared among several physical servers, which allows the enterprise to centralize storage across multiple servers. Consolidating storage also makes a good case for buying some of the higher-performing (and more expensive) storage solutions, while at the same time making backup and data archiving much simpler; it can even include things like SAN replication.  Keep in mind that you’ll need to review LUN and RAID levels on a regular basis to make sure you’re not running into I/O conflicts between chatty databases that lean heavily toward reads or writes, for example transactional databases vs. data warehouses and reporting.
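To see which databases lean read-heavy vs. write-heavy before placing them on a LUN, you can query the `sys.dm_io_virtual_file_stats` DMV. A sketch, again via `Invoke-Sqlcmd` with a placeholder instance name:

```powershell
# Cumulative I/O per database since the instance last started.
# Read-heavy vs. write-heavy informs LUN and RAID-level placement.
$query = @"
SELECT DB_NAME(database_id)         AS [Database],
       SUM(num_of_bytes_read)       AS BytesRead,
       SUM(num_of_bytes_written)    AS BytesWritten
FROM sys.dm_io_virtual_file_stats(NULL, NULL)
GROUP BY database_id
ORDER BY SUM(num_of_bytes_read) + SUM(num_of_bytes_written) DESC;
"@

# Instance name is a placeholder
Invoke-Sqlcmd -ServerInstance "CONSOLIDATED\Main" -Query $query
```

Note the numbers are cumulative since the last instance restart, so sample over a period that reflects your real workload.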

Hybrid consolidation techniques

For most enterprises, a combination of the three types of consolidation is usually the most economical, depending on the business needs. For global organizations, it does not make sense to consolidate everything into one North American data center if there are specific applications only used in Europe or Asia.  Thus what you typically see is a main datacenter consolidation, with satellite data centers consolidated based on performance needs.  However, experience has shown that where possible it is best to centralize your data into one location in order to simplify administration and maintenance and reduce overall administrative costs.

Advantages of Database Consolidation


Licensing costs

Microsoft has moved away from the per-seat licensing model and is headed full steam into the realm of per-CPU licensing.  This means the more databases you can consolidate on a CPU, the more money you save on licensing.  It also means that if you can reduce the number of physical servers, you reduce the number of OS licenses, administrative tools, antivirus solutions, backup and monitoring tools, even rack space.  This can add up quickly to thousands of dollars in savings, especially when our physical servers have proliferated and we have a whole rack or three of underutilized physical servers that could be serviced by a couple of high-performance servers.  Remember, if our databases are dispersed unnecessarily, we are paying for extra licenses we probably don’t need. With the help of database consolidation, we can easily eliminate this unnecessary cost, because we no longer pay the license and maintenance fees for the decommissioned SQL Servers.


Hardware costs

I’ve seen a lot of powerful servers out there, bought at the time because the project demanded the best in hardware and performance.  Those same servers are running at 5%-10% CPU utilization and only use 4GB of a 24GB memory stack.  It happens.  Departments are sometimes put in a use-it-or-lose-it position and go a little hog wild with their purchases, or the original estimates for performance or capacity were overblown, or egos get involved in having the biggest and meanest servers.  Database consolidation helps us reduce the number of physical servers, and those freed-up servers can be used elsewhere on new or existing projects.  This only works well when you have a capacity management group that understands the history of an application and its resource utilization, but it’s well worth the time and effort, especially when you have a $60K server running an application where a $5K server would perform just as well.  Better to consolidate on a few high-performing servers and free up the resources.

Soft costs

Electricity costs money, both to power the server and to cool it down.  If you’re renting space in a CoLo datacenter, rack space costs money.  Every SQL Server you run should be managed by a DBA; generally the ratio is something like 12 servers to 1 DBA.  If you have 50 or 60 servers, that forces you to hire an additional DBA you might not need if you consolidated your databases down to 20 or 30.  Freeing up a DBA means they can work with development to help fine-tune processes and tweak systems to get the best performance possible, instead of just barely keeping up with day-to-day maintenance tasks.


Database consolidation also allows us to spend less on network traffic for things like configuration management and monitoring software. For example, let’s say we’ve got a third-party monitoring tool installed on each of our Windows servers that monitors and automates patching. If we consolidate our databases down to just a few servers, we have fewer configurations to touch every time the software configuration changes, especially with Patch Tuesday and the various security bulletins that Microsoft puts out.  Fewer servers mean a lighter load on the network from monitoring packets and transmission of patch files.  It may also mean going from a 50-port switch to a 10-port switch.  Database consolidation reduces networking costs, not only in physical assets such as routers, switches, and cables, but also in utilization, because fewer payloads are being transmitted.  And with fewer servers, patch cycles are quicker, so less risk surface is exposed.

Standardization of SQL Server infrastructure

So this has managed to bite me more than once.  You think you have all your standards in place, you’ve tested your patches before deploying, and then you deploy and something breaks because there was this one server over here that is about five patch cycles behind that nobody remembered to test at the time.  If we have a defined set of standards for our SQL Server instances and fewer instances to monitor, there are fewer chances of a surprise when we are required to make changes.  More importantly, as your standards change you have fewer servers to update to stay in compliance.  It’s all about no surprises.
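Finding that one server that's five patch cycles behind is easy to script. A hedged sketch (instance names are placeholders; assumes the SQL Server PowerShell module is loaded) that reports version and service-pack level across instances:

```powershell
# Placeholder instance names -- substitute your inventory
$instances = "SQLPROD01", "SQLPROD02\App1", "SQLPROD03"

foreach ($i in $instances) {
    # SERVERPROPERTY exposes build number and service-pack level
    Invoke-Sqlcmd -ServerInstance $i -Query "SELECT SERVERPROPERTY('ProductVersion') AS Version, SERVERPROPERTY('ProductLevel') AS SP;" |
        Select-Object @{Name = "Instance"; Expression = { $i }}, Version, SP
}
```

Run it before and after each maintenance window; any instance whose build number doesn't match the rest is your outlier.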


Database consolidation also helps us improve infrastructure security.  We can review security levels and perform audits in less time because there are fewer servers.  If someone leaves the company and the policy is to change the passwords of all accounts they had access to, imagine the overhead of doing this if they had access to SQL accounts instead of just rights granted through AD (which is a whole other topic).  By reducing servers you reduce the attack surface. For example, if our databases are dispersed among one hundred servers as opposed to just ten, there are ten times as many avenues of attack, because we have to manage and maintain more users and logins.

Where to Start?

With a solid case for consolidation under your belt, the next step for enterprises is deciding how best to go about it. In general, like most projects, there is a logical set of steps to take in a database consolidation process, which can loosely be mapped to the ALM process that Microsoft has made so popular in the developer world with Team Foundation Server.

Typically database consolidation will involve at least these three steps, but of course mileage may vary and you’ll add additional steps as needed depending on the size of the consolidation, scope and any regulatory requirements.


Envision the Project

Figure out what you’re going to do.  Why are you doing it?  What is the cost-benefit ratio of doing this project, and what are the risks if you do not?  The project may die on the vine at this step, and that’s OK.  This is the step where you figure out not only whether you want to do the project and whether it makes financial sense, but where the finish line is.  What does the final result look like, and how do you know if the project has succeeded?  Set your goals.

Create a Plan

One of my early managers was always fond of saying, “Plan the work, work the plan.”  If you don’t have a task list and people don’t know who is responsible for what, how do you hold them accountable and how do you know if you’re on track?  At what point are you behind schedule and what impact does it have on the rest of the timeline?  In this stage we identify and document the major steps that are involved in database consolidation project. For example, what kind of hardware do we need, who is going to order it, how much does it cost?  What are the lead times for installation and configuration?  We need to identify resources involved and make sure we’re on their calendar.  Most importantly we need to identify who the key stakeholder of the project is and who is bringing the money to the table to pay for all of this.  If they don’t agree with the goals you set during the Envisioning phase that needs to get hammered out now.  Ultimately, as fun as new technology is, if the business isn’t happy then nobody is going to be happy.  Make sure the goals and steps are clearly defined and that you also include in your plan frequent status meetings to keep key stakeholders up to date.  The biggest reason to have a plan is so that there are no surprises down the road, so remember that plans change and always keep yours up to date.


Work the Plan

Order the hardware, get resources working on the project, assign tasks, and make sure they are completed when expected.  Each step should be verified and checked off the list when complete.  The plan should be available to all who are involved in the project (put it on a SharePoint site!).  Remember, a successful project means no surprises.

In Conclusion

Database consolidation has several benefits: reduced administrative and operational costs, simplified administration, improved infrastructure security, and better usage of computing resources and capacity.  This is a topic that should be revisited regularly.  Remember that the enterprise evolves, and you consolidate when and where it makes sense.  One of the most frustrating things I’ve heard at companies is, “Well, yes, we looked at database consolidation a few years ago, but it just didn’t make sense.”  That was back when they only had 10 servers, and now they are running 40…  This should be a strategic policy, revisited at least once a year, because businesses change, technology changes, and what may not have been a business need a year ago may be a money maker (or saver) today.

SharePoint | Powershell To Create Personal Site For All SharePoint Users

From time to time I need to set up a somewhat realistic testing environment, especially recently as I’ve been playing around with personal sites and setting up profiled dashboards for different users (tech support, IS manager, line of business) in order to test applying certain pages or web parts to specific roles (which I don’t really want to run against production just yet).  I don’t know how useful this would be in a production environment; however, it comes in very handy when I’m setting up test environments and want to fill out the MySite area with a bunch of personal site collections for testing.  I could set up replication from production or copy the profile database from production, but this way I don’t even need to touch production to get what I need for testing purposes.

The key here is to run through the UserProfileManager, get all the user profiles, and then, if a profile’s .PersonalSite property is $null, go ahead and create the site.  At the end, don’t forget to .Dispose() the SPSite!

I also go through and set up an SP personal admin account to be admin for all personal sites we create.  This is so that later when it’s in production, I can give someone in HR the rights to review everyone’s personal sites without having to give them Farm Admin rights, because let’s face it that’s just a scary idea.

# Code to create personal site for users if they don't already have one.
# Assign a Personal Site Admin account to each PersonalSite so we don't need to give Farm Admin

#Add our SnapIns...
if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) {
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

#Set up script variables
$mysiteHostUrl = "http://MySite"
$personalSiteGlobalAdminSAM = "DOMAIN\sp_p_admin"
$personalSiteGlobalAdminEmail = "sp_p_admin@iedaddy.com"
$personalSiteGlobalAdminDisplayName = "SharePoint Personal Sites Admin"

#Now load up our objects
$mysite = Get-SPSite $mysiteHostUrl
$context = [Microsoft.Office.Server.ServerContext]::GetContext($mysite)
$upm = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($context)
$AllProfiles = $upm.GetEnumerator()

#Loop through all our profiles
foreach ($profile in $AllProfiles) {
    #Get display name and SAM account
    $DisplayName = $profile.DisplayName
    $AccountName = $profile[[Microsoft.Office.Server.UserProfiles.PropertyConstants]::AccountName].Value
    #Make sure they are in our DOMAIN
    if ($AccountName -like "DOMAIN*") {
        #Check if a personal site already exists
        if ($profile.PersonalSite -eq $null) {
            #Go ahead and create the personal site
            Write-Host "Creating personal site for", $AccountName
            $profile.CreatePersonalSite()
        }
        else {
            Write-Host "Personal site already exists for", $AccountName
        }

        #Good to add an SP admin so it's easier to check the personal site stuff for role-based profiles w/o Farm Admin rights
        $pweb = $profile.PersonalSite.OpenWeb()
        #Put the padmin into the site's user list and make them a site admin
        $padm = $pweb.EnsureUser($personalSiteGlobalAdminSAM)
        $padm.IsSiteAdmin = $true
        $padm.Update()
        $pweb.Dispose()
        Write-Host "Personal Site Admin has been assigned"
    }
}

#All done, remember to dispose of the SPSite object!
$mysite.Dispose()

SharePoint 2010-How to Hide the Ribbon from Anonymous Users and Users without Edit Privileges

In SharePoint 2010 the new ribbon bar (or Fluent UI as they call it) is great when you need to Author a site, but looks rather ugly and “SharePointish” for internet facing sites.  After all who wants to see this at the top of their pages?


As such I often have to remove it from the UI.  I have found that the best way to do this is through the CSS style sheets and a little SPSecurityTrimmedControl magic in the site Master Page.  Here are the steps to quickly and easily remove this bar for people who are not authoring on the site:

  1. Open your SharePoint master page
  2. Locate this line: <div id="s4-ribbonrow" class="s4-pr s4-ribbonrowhidetitle">
  3. Change it to: <div id="s4-ribbonrow" class="s4-pr s4-ribbonrowhidetitle" style="display:none">
  4. Now find the end of the “s4-ribbonrow” div and add the following block right after it:
    <SharePoint:SPSecurityTrimmedControl ID="SPSecurityTrimmedControl2" runat="server" PermissionsString="AddAndCustomizePages">
    <script type="text/javascript">
    document.getElementById("s4-ribbonrow").style.display = "block";
    </script>
    </SharePoint:SPSecurityTrimmedControl>
  5. Save the new master page and publish it

And as simple as that, the ribbon bar is now hidden from all visitors who don’t have the ability to edit pages, including those who are coming in Anonymously.

SharePoint 2010–Allowing Anonymous Users to Answer Surveys

Recently I was asked to create an anonymous survey in SharePoint.  In many ways these are fairly annoying because SharePoint really wants to know who you are when you’re on the site, so you have to be careful about what features you utilize in your anonymous sites.  However, I think I’ve found a pretty straightforward way of addressing the issue, as long as the company is comfortable with creating a separate web application that only allows anonymous access and doesn’t try to authenticate.  This site should only be used to house items they want to have under anonymous access, so I tend to “portal” the functionality from the anonymous web app inside something like an intranet if needed.

First step is to create an anonymous web application for anonymous surveys.  There are several reasons for wanting to use this approach:

  • You have an external facing site where users don’t log in.
  • You have an internal web site and you want to be absolutely certain that user responses to a survey are 100% confidential and anonymous.
  • You want to use only the out-of-box capabilities of the SharePoint survey module.

However, realize that being anonymous does come with some drawbacks:

Multiple page surveys don’t work on an anonymous site

So what exactly does this mean? Well typically longer surveys will contain multiple pages and / or branching logic. In SharePoint when you respond to a survey that has multiple pages, SharePoint automatically saves the state of the survey when you move from one page to the next. It does this so that in case you lose connectivity while completing a long survey, you’ll be able to continue where you left off. It doesn’t really matter whether there might be better ways to implement this; the fact is that SharePoint writes the work-in-progress survey to the survey list when you move to another page in the survey. The problem is that SharePoint cannot write the incomplete survey to the list unless it can associate that survey with a particular user. If the user cannot log in at the point when they move to the next page, they won’t be able to continue with the survey.

Anonymous users can’t save progress

If the user is not logged in (i.e. anonymous) SharePoint will automatically request the user credentials. If the user is using IE (Internet Explorer) and the web site is in the Intranet zone, IE will by default automatically send through the user’s domain credentials and the user will be logged in (defeating the purpose of being anonymous). Alternatively the user will be prompted for credentials and if they don’t enter these credentials, they will be presented with an unauthorized dialog box:


If the user does log in, they will then move from the NewForm.aspx page to the EditForm.aspx page with the survey id as one of the query string parameters (e.g. …/EditForm.aspx?ID=6&…). Since the survey is now in edit mode as opposed to new item mode, it stands to reason SharePoint needs to have the user context in order to retrieve that specific survey and to ensure the user can’t retrieve anyone else’s. The Create responses and edit responses that were created by the user setting on the Advanced Settings page of the survey has no meaning unless the user is logged in:


If you want to edit the partially complete survey at this point, you simply have to be logged in.

Intranet sites are typically not anonymous

Another caveat is that Intranet sites are almost always not anonymous. To get around this you could set up a separate web site specifically for surveys where users could connect anonymously. This would entail something along these lines:

Create a new web application that supports Windows Authenticated users for survey administrators (e.g. http://survey.admin). This would normally be created in the Default zone. Create a top level site collection and grant specific permissions to the Survey Administrators so they are able to create new surveys on this site. You could further grant other permissions as necessary to any other users that may need to view survey responses or perform other actions on the site.

Extend the web application to another zone / IIS web site (e.g. to the Intranet zone on the http://survey site). Grant access to anonymous users for this zone, and remove access for Windows Authenticated users.


This will prevent users from being able to log into the http://survey web site using their Windows credentials, but survey administrators will still be able to log into the http://survey.admin web site to manage content as this is in a different zone.

Of course you still won’t be able to have multiple page surveys, but users will never be able to log into this site (http://survey) either accidentally or automatically.

This method is the same approach you would typically take on an Internet facing site.
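The two provisioning steps above (create the Windows-authenticated web application, then extend it to an anonymous zone) can be sketched in PowerShell. The URLs, application pool name, and managed account below are assumptions; substitute your own:

```powershell
# Default zone: Windows-authenticated site for survey administrators.
# App pool name and service account are placeholders.
New-SPWebApplication -Name "Survey Admin" -Url "http://survey.admin" -Port 80 `
    -ApplicationPool "SurveyAppPool" `
    -ApplicationPoolAccount (Get-SPManagedAccount "DOMAIN\sp_service")

# Intranet zone: anonymous-only extension of the same web application
Get-SPWebApplication "http://survey.admin" |
    New-SPWebApplicationExtension -Name "Survey Anonymous" -Zone Intranet `
        -Url "http://survey" -AllowAnonymousAccess
```

You'd still remove Windows-authenticated user access from the Intranet zone afterwards (via the authentication provider settings in Central Administration), so the anonymous zone never prompts for credentials.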

Once you have created the anonymous only web application, the next step is to create a survey like you normally would, remembering that features like branching logic will not work.

You could further enhance the user experience to create the perception of having multiple pages by incorporating some clever JavaScript / jQuery and css to show and hide sections on the survey as a user progresses through it. This would likely mean editing the NewForm.aspx page in SharePoint Designer for the specific survey you wanted to do this on. You’d then have to repeat the process for future surveys you wanted this functionality on.

After the survey has been created, you’re still not completely done.  You have to apply a little anonymous permissions magic in order to allow for anonymous users to actually contribute to the survey and fill it out…

Set advanced settings

On this screen, we set ‘Read Access’ to ‘All Responses’, ‘Edit Access’ to ‘Only their own’ and ‘Allow items from this survey to appear in search results’ to ‘No’. Setting read access to ‘All Responses’ seems a bit weird. After all, you will normally not want your anonymous users to see the responses of all other users. We will change this back later, but for some reason, it is necessary to select this option here.

Set Permissions

Next we go to the permissions screen:

The survey by default inherits the permissions from the site, but for anonymous surveys this will not do. By selecting ‘Edit Permissions’ from the ‘Actions’ menu, you can specify specific permissions for this survey.

The Settings menu has now appeared, and from it we choose ‘Anonymous Access’.

In this screen, make sure that the check boxes for ‘Add’ and ‘View’ are selected.

Set advanced settings again

If this is alright, we can go back to the ‘Advanced settings’ screen to set Read Access back to ‘Only their own’.

There, you’re set. The survey can be filled out by all anonymous users, but they cannot see each other’s entries. The information entered in these forms will be stored in the survey’s library, and you can inspect each entry or view aggregated graphs of all entries. So what happened here? We first set ‘Read Access’ to ‘All Responses’, because only then can we set the Anonymous Access settings (all the check boxes will be greyed out otherwise). But after setting the permissions, we can safely turn access to all responses off again. This procedure is a bit odd, but it works.

SharePoint 2010 – Classic Mode vs Claims Based Authentication

Classic Mode: This is nothing but Windows authentication (NTLM or Kerberos). In the past, if a web application was created with Classic Mode authentication, you had to extend the web application to a new site if you also wanted to enable forms-based authentication (FBA), and each login (NTLM/FBA) was unique.  To configure such an application to use forms-based authentication, you can convert it from Classic Mode to Claims Based. However, no UI exists for doing this conversion; the only way to do it is through PowerShell.

$app = Get-SPWebApplication "URL"

$app.UseClaimsAuthentication = $true

$app.Update()


Note that if an application was created using Claims Based authentication, converting it back from Claims Based to Classic Mode is not possible, either through the UI or through PowerShell.

Claims Based: In SharePoint 2010 for a web application we can enable both windows, forms authentication. In earlier implementation to do this, we have to create two web applications which has different zones and different authentication. But, with the new claims based authentication a single application can have the capability to configure both windows and forms under single URL. All this is possible because of the authentication framework is built on Microsoft Identify Foundation. And it uses “Geneva” framework to handle this authentication.