Thursday, March 3, 2011

Performance Testing Hyper-V

I’ve been reading up on everything Hyper-V, but I’m still very curious about the specific performance characteristics of different configurations. Specifically (at this stage) I want to know whether there is a sweet spot for RAM, and what effect adding more virtual processors to the guest has. I’ll use this information to make better choices when it comes to selecting VM configurations for my SharePoint farm (or whatever else I happen to feel like playing around with).
I’d also like to form a solid baseline for comparison if I make hardware changes etc.

I hopped on to Bing and found the PassMark Performance Test software. It looks like it does everything I need, so I decided to give it a go.

My testing methodology is to spin up each configuration, run the test 4 times, reboot, and run it another 4. I’ll disregard the highest and lowest total score from each batch of 4 and average the remaining 4. I’ll run this over a range of configurations and compare the results. I’ll also compare the results to a “max theoretical score”, which is obtained from running the tests on the host machine.
At this stage I am not going to work out standard deviations or test for significant differences; however, depending on the ‘logic’ of the results I get, I might do this in the future.
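To be explicit about the arithmetic, here’s a quick sketch of the scoring scheme in Python (the run totals are invented, purely for illustration):

def trimmed_average(batches):
    """Drop the highest and lowest score from each batch and average the rest."""
    kept = []
    for batch in batches:
        ordered = sorted(batch)
        kept.extend(ordered[1:-1])  # discard the lowest and highest of this batch
    return sum(kept) / len(kept)

# Two batches of 4 runs: before and after the reboot (numbers invented).
before_reboot = [824.1, 826.8, 831.0, 827.5]
after_reboot = [823.9, 829.7, 825.2, 828.3]
print(trimmed_average([before_reboot, after_reboot]))  # averages the middle 2 of each batch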

I have scripted the test to run only the CPU, memory and disk benchmarks (thanks to this forum post - http://www.passmark.com/forum/showthread.php?t=1701). I don’t have a graphics card installed, and I’m not sure graphics results would be valid in an RDP window anyway…
I will dedicate a particular VM to this task, which I will use for nothing else. I plan to revisit the testing a number of times (it would take too long to do all the tests I want to do in one go).

I ran tests on the host and 2 different CPU configs for a guest. I was planning to do 1, 2, 3 and 4 CPU configurations, but after doing 1 and 4 it didn’t seem worth continuing – there was no real difference…

                  Host          Guest 1       Guest 2
CPU Mark          5523.3        1013.78333    1031.36667
Memory Mark       2109.1        814.266667    767.4
Disk Mark         908.183333    7131.95       6100.25
PassMark Rating   2448.1        826.8         827.583333

[Graph: results as a percentage of the host score]

Host = Phenom II x6, 16GB RAM, included as a reference
Guest 1 = 4 virtual CPUs, 4GB static RAM Hyper-V guest
Guest 2 = 1 virtual CPU, 4GB static RAM Hyper-V guest
Value in graph expressed as % vs host result.

My conclusion – either the way Hyper-V schedules CPU time means that there is no difference when the VM load does not tax the host, or my testing software is flawed in a VM. I’m going to need to do some reading, and perhaps retest with multiple VMs at once (say, 3 VMs of the same config running the tests simultaneously). 

Monday, February 28, 2011

Hyper-V Templates for SharePoint 2010 using Sysprep

I’ve decided on a very simple farm for my first attempt at virtualising SharePoint 2010 at home – I’m creating 4 servers.

Server 1 is a domain controller. I’ll reuse this server over and over again, so once it is set up I’ll take a copy of the virtual hard drive; that way, if I ever need to start from scratch, I can. I’ll allocate the DC 512MB RAM, as this is the Windows Server 2008 R2 minimum spec, and it’s not exactly going to be taxed controlling a domain of 4 servers.

Server 2 is a SQL Server. Again, I’m likely to reuse this server, so I’ll take a copy of the VHD. I’ll allocate 4096MB RAM to this server.

Servers 3 and 4 are web and app servers respectively. Again, 4096MB RAM each.

Creating multiple servers is actually pretty easy in Hyper-V.
- I started by creating a single server running Windows Server 2008 R2, installing all the latest updates and Service Pack 1, and enabling things like Remote Desktop.
- Next I shut down the VM I had just created, logged in to the host and took a back-up of the VHD. This will be useful later when I need to update my base image (or, potentially, install things like a virus scanner by default).
- Next, restart the VM. Once in Windows, run the sysprep utility. As it turns out, sysprep ships with Windows these days and can be found at C:\Windows\System32\sysprep
- I selected the following options:
[Screenshot: the sysprep dialog – ‘Enter System Out-of-Box Experience (OOBE)’ with ‘Generalize’ ticked, and ‘Shutdown’ as the shutdown option]
Selecting Generalize ensures a new SID is created each time the image is used, so the machines don’t clash.
- Sysprep will do its thing and then shut down the machine. At this point, take a copy of the VHD and mark it as read-only. This is your system template.
- I copied the VHD 4 times, one for each server I wanted to create. Then I created 4 new VMs, each attached to one of the copied VHDs with the appropriate amount of RAM, and fired them up.
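As an aside, if you’d rather script the generalise step than click through the dialog, the same options can be passed to sysprep on the command line:

C:\Windows\System32\sysprep\sysprep.exe /generalize /oobe /shutdown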

There you have it. 4 servers created in the time it takes to install 1 OS. :)

Sunday, February 27, 2011

Remote Management for Hyper-V

As part of my virtual lab project, I want to be able to manage my virtual machines from my laptop. It’s usual practice in most organisations to only remote in to servers when absolutely needed, so I’m going to take the same approach at home (may as well replicate a real environment as much as I can!).

First things first, I’m going to need the Hyper-V MMC snap-in for Windows 7. The Remote Server Administration Tools (RSAT) can be downloaded from Microsoft here:
http://www.microsoft.com/downloads/en/details.aspx?FamilyID=7d2f6ad7-656b-4313-a005-4e344e43997d&displaylang=en
Once you’ve installed that, you’ll need to enable the Hyper-V features using the ‘Turn Windows features on or off’ link in Programs and Features in Control Panel.

Next you’ll need to configure the Hyper-V role on the server to allow you to administer it remotely; by default, if you add the server to Hyper-V Manager in Windows 7, you’ll get an error stating you don’t have permissions. In my set-up, the virtual host and my laptop are both workgroup computers; any domains I set up will be at the guest level. Getting it to work when both computers share a domain is straightforward – when they’re in a workgroup, not so much. A quick Google search pointed me to HVRemote, a script written by John Howard. His blog can be found here - http://blogs.technet.com/b/jhoward/ and the script can be found on MSDN here - http://archive.msdn.microsoft.com/HVRemote.

I followed the instructions for the workgroup-to-workgroup configuration on the MSDN page – and it works.
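For the record, the guts of it boils down to a couple of commands – I’m quoting these from memory, so verify them against the HVRemote documentation before running anything:

cscript hvremote.wsf /add:YourUserName      (run on the Hyper-V host; grants the account remote access)
cscript hvremote.wsf /anondcom:grant        (run on the Windows 7 client; allows the anonymous DCOM callbacks the MMC needs)

If the user names or passwords don’t match between the two machines, the built-in cmdkey utility can store credentials for the host.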

Setting up a Virtual Lab for SharePoint 2010

So - having always run virtual machines on my laptop with the previous version of SharePoint, its 4GB of RAM just isn't enough anymore. I figured I may as well take advantage of rock-bottom hardware prices and set up a virtual lab.

For home use, consumer-grade hardware is perfectly capable of hosting virtual machines reliably, so I logged on to Umart to see what was on offer.
After a while, I settled on an AMD-based system, as the prices seemed lower for what I wanted to do (raw computing power per dollar was higher). I went with the following hardware:
AMD Phenom II x6 1055t
Gigabyte GA-870A-UD3 motherboard
Kingmax 8GB (2 x 4GB) DDR3 RAM kit x 2
Western Digital 1TB SATA3 hard drive x 2

I stuck it all in an Antec NSK6582b case with an EarthWatts 430W power supply.

I plan on installing a couple of extra hard drives at a later stage and getting a RAID array going for performance – the motherboard supports it – but that is a battle for another day.

All in all, I've ended up with a 6-core, 16GB RAM machine for under $800. Since the whole thing is for dev and testing, my TechNet subscription will cover any licensing, so I went ahead and installed Windows Server 2008 R2 and configured the Hyper-V role.
Now to create a virtual machine template using sysprep... let the games begin!

Saturday, February 27, 2010

Employee On boarding / Off boarding with SharePoint and InfoPath

There have been a few times with recent employers where I have been asked to help streamline the on-boarding / off-boarding process. In fact, with my most recent employer it has been priority number 1 for my manager. While the theory of this is quite simple, in practice it is another beast entirely.

There are a few key factors in my experience which tend to complicate this task. Primarily, the issue lies with an ill-defined business process; most businesses know the ‘general idea’ of hiring someone, but take 2 line managers and the chances are they go about the process in completely different ways. A secondary concern is often cost; businesses hate to spend money at the best of times, and processes like employee on-boarding / off-boarding are often a very low budgetary priority (yet remain a high priority for the people involved!). It’s very hard to demonstrate the value in improving the process; intuitively it’s a no-brainer, but try showing that on paper…

Get the requirements nailed!!

Regardless of which technology ends up being the answer (if any!), the process and requirements need to be absolutely set in stone up front. This becomes especially important if the technology of choice happens to be InfoPath / SharePoint Designer workflow, but the point remains valid no matter what the solution. Avoid the temptation to let a single person define the process; almost universally, different people at different levels have different points of view. You may also get the situation where an overzealous HR manager over-defines the process.

My suggestion is to get a good BA involved. You’ll only need a few hours of their time, but their ability to tease real requirements out of the business and separate wants from needs is invaluable. If possible, get them to run a workshop and have them produce a flow diagram of the business process too – this makes defining a workflow so much easier. (Obviously you can do this yourself, but make sure you have a different ‘hat’ on. Technology should not drive the process!)

Technical specifications

Once the business process has been defined, it should be fairly straightforward (but time consuming) to build technical specifications. Regardless of the technology selected to provide the solution (this may not even be a choice), good technical specifications are vital. Focus on defining up front the entire collection of fields which will need to be filled in by the user, as well as those the system will fill automatically. Next, split these fields out by ‘what should be filled in when’ according to the flow diagram defined by the BA. Avoid any assumptions at this point; if you’re unclear on anything at all, however minute, have the process owner define it in writing and ensure it makes sense in the context of the entire process.
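As a sanity check on the ‘what should be filled in when’ split, it can help to write it down as a simple mapping of step to required fields. A throwaway sketch in Python (the steps and field names here are invented for illustration, not taken from any real spec):

# Which fields must be complete before each step can finish.
# Steps and field names are hypothetical.
REQUIRED_FIELDS_BY_STEP = {
    "Request submitted": ["employee_name", "start_date", "role"],   # filled by the PM
    "Manager review":    ["request_status", "reviewed_by"],         # filled by the resource manager
    "Provisioning":      ["ad_account", "equipment_ordered"],       # filled by IT / the system
}

def missing_fields(step, form_data):
    """Return the required fields for this step that haven't been filled in yet."""
    return [f for f in REQUIRED_FIELDS_BY_STEP[step] if not form_data.get(f)]

print(missing_fields("Request submitted", {"employee_name": "J. Smith"}))
# -> ['start_date', 'role']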

After this, define how the artefact (i.e. the on-boarding form) should be routed at each step. For example, the process may start with a PM requesting a new resource. Upon submission, the request status may automatically be set to “New”. The resource manager may then process the request and set the status to one of several choices – something like “Rejected”, “Approved – existing resource to be provided” and “Approved – new resource to be hired”. Clearly the next step in the workflow is highly dependent on which status is selected.
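To make the routing idea concrete, here’s a throwaway sketch of that same example (Python purely for illustration – the statuses are the hypothetical ones above, not from any real system):

# Map each request status to the next step in the workflow.
NEXT_STEP = {
    "New": "Resource manager to review the request",
    "Rejected": "Notify the requesting PM and close the request",
    "Approved - existing resource to be provided": "Assign the resource and notify the PM",
    "Approved - new resource to be hired": "Kick off the recruitment sub-process",
}

def route(request_status):
    """Return the next workflow step for a given request status."""
    if request_status not in NEXT_STEP:
        raise ValueError("Unhandled status: " + request_status)
    return NEXT_STEP[request_status]

print(route("New"))  # -> Resource manager to review the request

The useful discipline here is that every status must map to exactly one next step – any status you can’t route is a gap in the spec.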

Functional specifications

The next step should focus on defining the process in the context of the technology selected to deliver the solution. At this point you will need to start getting creative; think about the ‘view’ the user will see at each step, and how you’re going to ensure they see that view and only that view (Murphy’s law: give them options and someone will stuff it up). This gets complicated because of the potential for back and forth between the requester and the resource manager. There are a number of options at this point, but fundamentally they’re all going to be driven off the value of the “Request Status” field.

You can use conditional formatting in InfoPath to define which fields are displayed depending on the status, or you can use jQuery to show/hide selected fields on the SharePoint item form. Alternatively, you can use SharePoint Designer and the Data View web part. Licensing arrangements may dictate your choice (Forms Services requires enterprise licensing), but if it’s available, InfoPath is the most flexible.

Develop your functional specs in a very technology-specific way and leave no ambiguity. Ensure you cater for all contingencies. Once defined, have several independent people sanity-check what you have proposed.

Build it

All of the above may seem excessive; however, regardless of which technology is eventually selected – and especially if it is an SPD/InfoPath solution – changes midway through can be highly disruptive. SPD workflows, and especially InfoPath forms, have a tendency to become very unstable if significant changes are made to one or the other. Always develop your InfoPath form first and your SPD workflow second. This is what makes all of the above so vital: if you do not plan effectively, you cannot work in this order, and as a consequence you will end up doing and redoing settings and configuration time and time again!

Saturday, June 20, 2009

The Browser Should Get Out of the Way!

A former colleague of mine recently showed me a YouTube video which had been posted by Google. They posed the question "What is a browser?" to the general public in New York...

The results are very interesting - most of the respondents do not differentiate between the browser they use and the applications they browse, with the majority of people having very little idea of the difference. A lot of people actually confused the Google search engine with the browser itself...

I'm sure others will have differing opinions, and I'm not sure this was the point exactly, but to me it says the browser needs to just get out of the way... As far as I'm concerned (and the people in the above video as well...), it's the applications and content ON the internet that I'm interested in - the browser is nothing more than a window into those applications. Loading it up with features, buttons and toolbars doesn't help me; it just gets in the way.

To be honest, I'm not surprised people don't really know what a browser is, because when it comes down to it, why the heck should I care? As long as it shows me the stuff I want, what's the difference?

Thursday, June 18, 2009

Issues Connecting to Analysis Services using PWA

As part of my inheritance of a SharePoint / PWA implementation, I've discovered lots of little bits and pieces which are not working as they should. Overall the implementation was, as far as I can tell, rushed and not particularly well architected.

I came across an issue the other day with some SSAS reports which are supposed to show an overview of all the projects we have on the go – the problem being, it never returns any data, just an error.



Upon enquiring as to why this might be, I was told “Oh, that’s never worked… it works on the server, but not on any clients”. Oh dear…

As it turns out, the reports would show on any of the servers in the farm, but on NO clients. Armed with a very consistent scenario such as this, troubleshooting should be fairly straightforward. The first thing that sprang to mind was authentication and the infamous “double-hop” issue - I had no real evidence of this, but it did provide a reasonable starting point…

After skimming a TechNet article on connection issues with SSAS 2005 (http://technet.microsoft.com/en-au/library/cc917670.aspx#ECAA), I attempted to test connectivity from the clients into SSAS. To do this, create an empty text file and rename the .txt extension to .udl. Now when you double-click on this UDL file, you’ll get the Data Link Properties window. In the Provider tab, select the type of data you’re connecting to (in this case Microsoft OLE DB Provider for Analysis Services 9.0, as we’re connecting to SSAS 2005). In the Connection tab, enter the server’s name and hit “Test Connection”. A very interesting thing happened: when using integrated authentication, the test failed with a “transport error”; if I specified an account on the SSAS server which I knew had access, the test would succeed. A quick look at the security log on the SSAS server confirmed that with integrated security, a failure audit was generated and the connection was blocked.
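For reference, a UDL file is just a text file; once configured, it contains an OLE DB connection string along these lines (the server name here is a placeholder):

[oledb]
; Everything after this line is an OLE DB initstring
Provider=MSOLAP.3;Integrated Security=SSPI;Persist Security Info=False;Data Source=MYSSASSERVER

MSOLAP.3 is the 9.0 (i.e. SSAS 2005) version of the provider.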


According to the TechNet article, this is usually due to insufficient permissions… however, using my account (or anyone else’s) on the server itself, I could access the reports just fine…

Turns out when PWA / MOSS was implemented – since the implementation wasn’t company-wide, and since the enterprise doesn’t actually use AD – a brand new AD domain was created for our department, and users were provisioned into this domain as needed. The log-in for the actual client machine the user accesses MOSS from is completely independent. I’m not a big fan, but it could work, IF done properly. One tiny but crucial thing was never done though: the domain that MOSS resides in does not trust the domain that users are logged in to. No client can ever access SSAS because their machine can never authenticate…

Creating a workaround is simple – provision a local account on the SSAS box for each user who needs access to the report, with the same username and password they use to log on to their computer, then add this account to a role in SSAS. Obviously that solution is not security best practice, and it would be completely unmanageable with a large number of users.
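(Concretely, that’s just a local user per person on the SSAS box – e.g. net user jsmith TheirPassword /add from an elevated prompt, where the name and password are made up for illustration – followed by adding that account to the appropriate role in Management Studio.)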
The permanent solution is to create a trust relationship between the 2 domains – this requires a domain admin to set up, which is going to take weeks *sigh* – hence the workaround for the time being. I’m of the opinion that a properly implemented SSO solution would go a long way towards solving this too, but fixing SSO is a beast of an issue for another day…