Thursday, March 22, 2012

Setting Exchange Calendar Permissions

The default permission for all users to another user's calendar is AvailabilityOnly. This permission lets you see that a user is busy, but not the subject or other details of the appointment. It is just enough to support free/busy searches for meeting requests.

In some cases, you might want to make the subject visible, and resource mailboxes are a good example. In the default configuration, meetings in the calendar of a resource mailbox show the name of the person who organized the meeting rather than the original subject. So, by allowing users to see the subject of meetings in a resource mailbox, you are effectively letting them see who booked the resource.

The permission to use for showing meeting subjects is LimitedDetails. The command to set this permission on a resource mailbox named Room100 is shown below:
Set-MailboxFolderPermission Room100:\Calendar -User Default -AccessRights LimitedDetails
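
If you want to verify the change, or review the existing permissions first, the matching Get cmdlet (available from Exchange 2010 SP1 onward) reads them back:
Get-MailboxFolderPermission Room100:\Calendar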

Creating a Library of PowerShell Functions

If you are doing a large amount of PowerShell scripting, it is likely that you have created functions you want to reuse between scripts. You can copy and paste a function from one script to another, but this creates a maintenance problem: if you improve the function, you need to update every script that has a copy of it.

To centralize management of functions, you can create a single PowerShell script that contains all of the functions. Then you load the script containing the functions from within other scripts before you call the functions. To load the functions script, use dot sourcing like this:
. .\functions.ps1

The example above loads functions.ps1 from the current working directory, which is the same directory as your script only if you started the script from there. You can also specify a full path such as:
. C:\functions.ps1

You can also use a variable as part of the path:
. $ScriptPath\functions.ps1
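
$ScriptPath is not an automatic variable, so you have to populate it yourself. Here is a minimal sketch of a script that derives its own directory and then dot sources functions.ps1 from it (Get-ServerReport is a made-up example function name), assuming PowerShell 2.0:
# Derive the directory that this script was run from
$ScriptPath = Split-Path -Parent $MyInvocation.MyCommand.Path
# Load the shared function library from that directory
. (Join-Path $ScriptPath "functions.ps1")
# Call a function that is defined in functions.ps1
Get-ServerReport
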
Another alternative is to create multiple PowerShell scripts that you call as needed from within your main script, effectively treating each script as if it were a function. I think a single script with functions in it is more elegant.

If you create a central library of functions, be sure you understand how all of your scripts use a function before you change it. A change that enhances one script may break another.

Wednesday, March 21, 2012

Enable PowerShell Remoting

PowerShell 2.0 introduces PowerShell remoting. PowerShell remoting allows you to run cmdlets or scripts on a remote server. For example, you can have one central server that hosts scripts. From the console of that server, you can then run the locally stored script on multiple servers.

In the words of the man who taught me that instructors don't need to read from a book (Bob Vergidis): What does this buy me?

Using PowerShell remoting buys you:
  • The ability to centralize maintenance of your scripts
  • A central location to schedule scripts to run as tasks
  • The ability to create a script that performs actions on multiple servers even if the cmdlets do not have an option to specify a remote computer
In order to use PowerShell remoting, three things need to be in place:
  1. Configure the WinRM service to create a listener
  2. Make sure that the WinRM service starts automatically
  3. Configure the firewall to allow communication with WinRM (port 5985); a rough sketch of these steps done by hand follows this list
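
For reference, here is what the three steps look like done manually (winrm quickconfig actually covers all three on its own; they are broken out here only to map to the list above):
# Set the WinRM service to start automatically and start it now
Set-Service -Name WinRM -StartupType Automatic
Start-Service -Name WinRM
# Create an HTTP listener and register the default firewall exception
winrm quickconfig -quiet
# If the firewall rule is missing, allow inbound traffic on port 5985
netsh advfirewall firewall add rule name="WinRM HTTP" dir=in action=allow protocol=TCP localport=5985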
The easiest way to enable PowerShell remoting on a specific computer is to run the PowerShell cmdlet:
Enable-PSRemoting
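
Once remoting is enabled on the target servers, you can run a command, or a locally stored script, against several servers at once (Server1, Server2, and the script path here are placeholder names):
# Run a command on multiple remote servers
Invoke-Command -ComputerName Server1,Server2 -ScriptBlock { Get-Service WinRM }
# Run a locally stored script on the same servers
Invoke-Command -ComputerName Server1,Server2 -FilePath C:\Scripts\Maintenance.ps1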

In small environments with only a few servers, manually enabling PowerShell remoting one server at a time works well; it takes only a few seconds per server. In larger environments, you will probably want to use Group Policy to push out the necessary configuration changes. This document describes how to create a GPO that enables PowerShell remoting:

Monday, March 19, 2012

Why S/MIME Sucks

Let's start with a brief explanation of S/MIME....

S/MIME is a method for encrypting and digitally signing email messages. Encryption prevents unauthorized users from reading a message. A digital signature verifies that the message was sent by an identified person.

To implement S/MIME, both the sender and the receiver must have digital certificates. Each certificate has a public key and a private key. For the process to work between User A and User B, each user needs a copy of the other's public key: to send an encrypted message to User B, User A encrypts it with User B's public key, and only User B's private key can decrypt it.

The certificates for S/MIME can be generated internally by an IT department if a certification authority is configured. Alternatively, you can buy certificates from a number of providers for $10-$15 each. The providers that sell certificates verify your identity so that the certificates are trusted by external recipients. The one bit of good news is that you can get a free personal certificate for S/MIME from http://cert.startcom.org/.

Why S/MIME Sucks
One of our clients received a message last week from a bank that wanted to send them confidential information in encrypted form. The bank uses some sort of S/MIME gateway, and our client got a message indicating that they needed to reply with their .p7b file (the public key) before the encrypted message could be sent.

There are a couple of problems here:
  1. The end user has no idea what to do with this.
  2. The end user does not have a certificate for S/MIME.
After obtaining a certificate, we sent back a digitally signed message (which attaches the .p7b file), but as of yet, the encrypted message still has not come through.

A Better Alternative to S/MIME
A number of providers offer secure delivery of email messages through a web site. Instead of the message being encrypted and sent directly, it is placed in a secure location and the recipient gets a notification message with a link to it. This avoids the need to set up certificates on each client.


PowerShell Error Colors

Like many of my posts, this falls into the category of neat, but I'm not sure it is very useful to anyone else. Today in my PowerShell class, students were finding it difficult to read the errors displayed by the projector. The room was fairly bright, and the red error text was difficult to read. To fix it, I changed the colors with the following code:
# Get the host's private data, which holds the console color settings
$colors = (Get-Host).PrivateData
# Show errors as black text on a white background
$colors.ErrorBackgroundColor = "White"
$colors.ErrorForegroundColor = "Black"

If you want to display all of the options that you can set, use the following code:
$colors = (Get-Host).PrivateData
$colors
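
These settings last only for the current session. One way to make them permanent is to add the same lines to your PowerShell profile script, sketched here (this assumes the file referenced by $PROFILE already exists; create it first with New-Item -Path $PROFILE -ItemType File -Force if it does not):
# Append the color changes to the current user's profile script
Add-Content -Path $PROFILE -Value @'
$colors = (Get-Host).PrivateData
$colors.ErrorBackgroundColor = "White"
$colors.ErrorForegroundColor = "Black"
'@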

Saturday, March 10, 2012

ASUS Drivers and OS Versions

Let me say that I should have known better than to buy a white-box system. It's just never as smooth as a brand-name box that supports the specific OS that you want. In this case, my excuse was that I was in a hurry and needed hardware sooner rather than later. However, the install process on this thing cost me about half a day.

On the off chance that anyone else struggles with this, here is my story.....

Hardware: ASUS P9X79 Pro motherboard

The good news is that this motherboard supports up to 64 GB of RAM which was my primary concern for this test box. Under normal circumstances, I'm not worried about drivers because Windows 7 drivers work in Windows Server 2008 R2 SP1 (the OS I'm using for Hyper-V and virtual machines).

My main problem was actually the way that ASUS distributes drivers. They have a special AsusSetup utility with an INI file that defines which files to use for various operating systems. I'm guessing that it works well for most people, but AsusSetup was detecting my OS as WNT_6.1I_64, which is not in their INI file. I tried running the setup file in compatibility mode, but it then detected yet another OS version that was also not supported.

So, there are two ways to work around this:
  1. Edit the INI files and add WNT_6.1I_64 = Win7_64 (see the sketch after this list). This works, but there are a lot of INI files to edit, because one AsusSetup runs another lower in the directory structure.
  2. Browse down far enough and you'll find the actual setup file that is run to install the driver. At this point, I've satisfied myself with installing the chipset driver, which resolved most of the unknown devices in Device Manager.
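
As a rough illustration of option 1 (the surrounding entries in the ASUS INI files are assumptions on my part; only the added mapping itself is real), the edit looks something like this:
; hypothetical excerpt from one of the AsusSetup INI files
WNT_6.1_64  = Win7_64
WNT_6.1I_64 = Win7_64  ; added line: map the detected OS string to the Win7 64-bit drivers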
I had also looked to see if WNT_6.1I_64 was an environment variable I could change, but it was not.

Unfortunately, the network card on this motherboard, the Intel 82579V, is not officially supported by Intel on Windows Server 2008 R2, so the drivers do not detect the card (even if you download them from Intel). However, if you manually update the device driver for the network card in Device Manager, you can browse to the PRO1000\Winx64\NDIS62 folder, select e1c62x64.inf, and choose the Intel 82579LM driver from the list. It is not the correct driver, but it seems to work. If you look in the INF file, it lists both the 82579LM and 82579V card descriptions, so I'm guessing the difference is primarily which advanced options are supported.

A better long-term strategy would be to get a PCIe network card (there are no PCI slots in this one), but I need this up and running today.

While I did figure out the INI file entry on my own, I did not figure out the NIC fix myself. It was originally posted by stephanvg on this thread:


Loading Storage Drivers During OS Installation

I recently cheaped out and bought a white-box server to use for my virtual machine test environment. During installation of Windows Server 2008 R2, it needed the RAID drivers. I didn't put a lot of time into testing this, but storage drivers have traditionally needed to be on a floppy disk (at least they did for earlier operating systems). The motherboard documentation said I needed to get a USB floppy drive.

My alternative was to go into the BIOS of the new system, open the USB settings, and force the attached USB drive holding the storage drivers to be seen as a floppy drive. Normally, USB storage is autodetected, and a large drive such as an 8 GB USB drive will be seen as mass storage.

During setup, I was able to browse to the drivers on the USB drive, and all was good.