It was a dark and stormy night. Database User Fred had an idea to venture into tables he wasn’t supposed to be in… OK, really, that was for my buddy Noel McKinney (Twitter | Blog). When you are writing, the single largest barrier is the first line; thanks to Noel, I have mine and we can get to it. Prior to SQL Server 2005, a schema was little more than another name for a database user. Since the SQL Server 2005 release, schemas are a true container in which objects can be grouped. This benefits security, because we can create separately secured groupings of objects within a single database and quickly move objects between those groups. For a DBA, the value shows up in controlling your own work. Best practice tells us to create a user database for maintenance and other administrative tasks on SQL Server, and these databases can quickly become littered with tables whose exact purposes are hard to track. Using schemas, a DBA can group those tables meaningfully, such as index maintenance tables in an IndexMaint schema and DMV collections in a Baseline schema. Once this is set up, a table is identified quickly by the schema that contains it.
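As a rough sketch of the grouping described above, the DDL might look like the following; the schema and table names here are illustrative assumptions, not taken from the post:

```sql
-- Illustrative only: schema, table, and column names are assumptions.
CREATE SCHEMA IndexMaint AUTHORIZATION dbo;
GO
CREATE SCHEMA Baseline AUTHORIZATION dbo;
GO
-- Index maintenance history lives in its own schema
CREATE TABLE IndexMaint.FragmentationHistory
(
    CaptureDate  datetime      NOT NULL,
    DatabaseName sysname       NOT NULL,
    IndexName    sysname       NOT NULL,
    AvgFragPct   decimal(5, 2) NOT NULL
);
GO
-- An existing dbo table can be moved into a schema without recreating it
ALTER SCHEMA Baseline TRANSFER dbo.WaitStatsSnapshot;
```

The `ALTER SCHEMA ... TRANSFER` statement is what makes the "quickly move objects" point above cheap in practice: it is a metadata-only change.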
This is an archive of the posts published to LessThanDot from 2008 to 2018, over a decade of useful content. While we're no longer adding new content, we still receive a lot of visitors and wanted to make sure the content didn't disappear forever.
I was running some tests on row- and page-level compression and noticed something interesting. Every table that contained a uniqueidentifier column produced estimated compressed sample sizes larger than the uncompressed size on the respective indexes containing the uniqueidentifier. This made sense given the nature of uniqueidentifier values, but I wanted to test it and make sure it was in writing before stating it as fact. Estimate Compression at ROW and PAGE Level
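The estimates discussed above come from SQL Server's built-in estimation procedure; a minimal sketch, where the table name is a hypothetical stand-in for one with a uniqueidentifier key:

```sql
-- Estimate compression savings before enabling anything.
-- 'OrdersWithGuidKey' is a hypothetical table, not from the post.
EXEC sp_estimate_data_compression_savings
     @schema_name      = N'dbo',
     @object_name      = N'OrdersWithGuidKey',
     @index_id         = NULL,     -- NULL = estimate for all indexes
     @partition_number = NULL,     -- NULL = all partitions
     @data_compression = N'PAGE';  -- or N'ROW'
```

Comparing the `size_with_current_compression_setting` and `size_with_requested_compression_setting` columns in the result set shows the effect described above without touching the table itself.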
Defaults surround us in SQL Server. This is both a good thing and a bad thing. Part of the SQL Server installation process is the choice of mixed mode security. Mixed mode in SQL Server means both SQL Authentication and Windows Authentication can be used. By default, SQL Server has Windows Authentication selected. Changing this default to mixed mode is a decision that should not be taken lightly. Mixed Mode vs. Windows Authentication When operating in mixed mode, both Windows Authentication and SQL Authentication can be used for connections to SQL Server. Sometimes the choice of one or the other is not up to us: applications built to use SQL Authentication force mixed mode over the Microsoft-recommended Windows Authentication. Windows Authentication is deemed more secure because no password validation happens at the SQL Server level; it is all handled by Windows and the principal's token. With SQL Authentication, passwords are stored and validated by SQL Server itself.
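To check which mode an instance is running in, the server property below is a quick test (this is a standard `SERVERPROPERTY` lookup; changing the mode itself is done through SSMS server properties or the registry and requires a service restart):

```sql
-- 1 = Windows Authentication only, 0 = mixed mode (SQL + Windows Authentication)
SELECT SERVERPROPERTY('IsIntegratedSecurityOnly') AS WindowsAuthOnly;
```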
Several times I found myself wanting to try out different features of TFS 2010, but my only options appeared to be building a virtual machine (time consuming and requiring gobs of money or an MSDN/TechNet subscription) or using a live server as a guinea pig (for some reason admins frown on this). Luckily there’s an easier option; I just wasn’t aware of it until last week. Microsoft has virtual machine images for a full TFS 2010 server installation, along with sample labs if you’re just getting started with TFS.
I recently took a three-day class on Windows Server 2008 R2: Deploying and Managing Failover Clustering. My clustering experience: At my previous job, I had supported SQL Server 2005 on a Windows Server 2003 cluster for a couple of years. I had tried to install a Windows Server 2008 cluster last year (and failed miserably, because I didn’t know anything). With my current job, I support several instances of SQL Server 2008 on Windows Server 2008 and SQL Server 2005 on Windows Server 2003.
SQL Server performs best when paired with hardware and an OS configured only for the core functionality of the SQL Server Engine. Installing other features along with the engine can degrade performance and the engine’s ability to process transactions. With this knowledge, why would anyone ever want to install one of the SQL Server features with the core engine? This usually comes down to cost. Cost considerations when installing features of SQL Server
You win some, you lose some. I just took my first Microsoft exam, the 70-448, that covers the BI suite of Analysis Services, Integration Services and Reporting Services for SQL Server 2008. Wow. The test was harder than I expected. Although plenty of time was allotted for taking the exam, some questions included material I hadn’t seen before, even though I read Microsoft’s Self-Paced Training Kit for the exam, cover to cover. No, I didn’t pass.
Recently I had to find a BI resource to do some ETL work for me. I wanted to make sure I had the right person for the job, so I came up with a list of interview questions that proved to be spot on. It really helped me weed out the folks who knew a lot of buzzwords from the folks who had actually worked with the different aspects of the BI ETL process. I’ve provided answers to a few of the questions. You would be surprised at how little people really know when they claim to be an expert at BI. I did find the perfect person using these questions, and the project is well under way and on target!
Introduction NuGet has been around for a while now, but what a program like this needs is content, content and more content. And NuGet now has plenty of that. So if you use open-source libraries a lot and you are not yet using NuGet, do it now. How it helps Now when I set up a test project I use NuGet. All my favorite packages are there and it takes seconds to install them. But I can also create my own package that has a dependency on all the packages I want and install everything in one go.
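The "package that depends on everything I want" idea above can be sketched as a meta-package `.nuspec`; the package id and dependency list here are assumptions for illustration, not from the post:

```xml
<?xml version="1.0"?>
<!-- Hypothetical meta-package: its only job is to pull in a favorite stack. -->
<package>
  <metadata>
    <id>MyFavoriteTestPackages</id>
    <version>1.0.0</version>
    <authors>you</authors>
    <description>Installs my usual test stack in one go.</description>
    <dependencies>
      <dependency id="NUnit" />
      <dependency id="Moq" />
    </dependencies>
  </metadata>
</package>
```

Running `nuget pack` on this file produces a package that, when installed, brings in every listed dependency at once.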
It seems to me that there are an enormous number of religious battles being fought in the IT world. The fight between Apple and Microsoft fanboys and fangirls. The battle between .NET and Java. The battle between SQL and NoSQL. The battle between C# and VB.NET. And it seems that if there is no battle, they will create one: the Anti-if movement, the craftsmanship movement. All this is of course not unknown to the human race. It seems that the human race thrives upon conflict and just needs it to make life worthwhile. The urge to always know better and to always be the one at the top is baked into our souls. Is it an animal instinct so many of us still need to satisfy? I don’t know, but I dearly hope so. Will we ever get past this? Will we ever have a more perfect universe like Gene Roddenberry thought we would have?