Tiny Deathstars of Foulness

Another article is up on another site. This one is part of a series I’m doing for Office Ninjas, a nifty little site for people who wear a lot of hats in small business office environments. Access it here!


I really enjoy supplementing the work I do on krypted with some of these types of articles. Look for more soon!

November 24th, 2015

Posted In: Articles and Books, Bushel



Before I post the new stencil, let me just show you how it came to be (I needed to do something, which required me to do something else, which in turn caused me to need to create this):


Anyway, here’s the stencil. It’s version 0.1, so don’t make fun: AWS.gstencil.

To install the stencil, download it, extract it from the zip and then open it. When prompted, click Move to move it to the Stencils directory.

Reopen OmniGraffle and create a new object. Under the list of stencils, select AWS and you’ll see the objects on the right to drag into your doc.


Good luck writing/documenting/flowcharting!

June 5th, 2014

Posted In: cloud, Network Infrastructure


Windows Server’s backup tools have become easier and easier to use over time. But there’s no more ntbackup. Well, there’s wbadmin, which is very similar. You can still restore data from older backup sets by downloading ntbackup’s restore tool at

Windows Backup is now capable of backing up a system with the same ease of use that Apple brought to automated backups with Time Machine and Time Machine Server. In fact, with only a few more options to work through, Microsoft’s tools provide access to some pretty nice features that are easily configured.

To get started, you’ll first need to install Windows Server Backup. To do so, use the Add Roles and Features Wizard in Windows Server 2012. Once it has been added, open Server Manager, click on the Tools menu and select Windows Server Backup.
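
If you prefer the command line, the same thing can be done with PowerShell. This is just a minimal sketch, assuming Windows Server 2012 and its ServerManager module:

# Adds the Windows Server Backup feature (wbadmin and the management snap-in)
Install-WindowsFeature Windows-Server-Backup -IncludeManagementTools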


From Windows Server Backup, you can enter the name of an Azure account to configure cloud-based backups. However, in this walkthrough we’re going to choose local backups, which really for us means backing up to a network share rather than the cloud, although we could back up to a USB drive or some other internal drive as well. Click Local Backup, then click Configure. Click on Backup Schedule… to bring up the Backup Schedule Wizard. At the Getting Started screen, click on the Next button.


At the Server Backup Configuration screen of the Backup Schedule Wizard, choose whether to back up all the data or perform a custom backup, which allows you to define only certain files to back up. I like to back up all the data for the most part, so we’re going to go with the full server and click Next.


At the Specify Backup Time screen, choose the appropriate times of the day to back the server up and click on the Next button.


At the Specify Destination Type screen, choose where you’d like to back your data up to and then click on the Next button. As mentioned, we’re going to back data up to a network share.


At the Specify Remote Shared Folder screen, provide the network path that you’d like to back your files up to.


The backups should then be tested and validated before putting a system into long-term production. The command line tool used to manage backups is wbadmin, which has the following verbs available to it (an example follows the list):

  • enable backup – modifies existing backups or makes new schedules
  • disable backup – disables a backup schedule
  • start backup – starts a one-time backup job
  • stop job – stops running recovery or backup jobs that are currently in progress
  • get versions – shows the details of backups for recovery
  • get items – lists the contents of a backup
  • start recovery – runs a recovery job
  • get disks – shows online disks
  • get virtualmachines – shows Hyper-V VMs
  • start systemstaterecovery – recovers the system state from a valid system state backup
  • start systemstatebackup – makes a system state backup
  • delete systemstatebackup – deletes a system state backup
  • delete backup – deletes a backup
  • delete catalog – deletes the backup catalog; typically used when the catalog has become corrupted
  • restore catalog – restores a catalog; only use this option when attempting to fix a corrupted catalog
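
For example, here’s a rough sketch of scheduling a nightly backup of the system volume to a network share and then kicking off a one-time backup to the same share. The share path is hypothetical, and you may also need the -user and -password operators for the share; check wbadmin /? for the exact syntax on your build:

wbadmin enable backup -addtarget:\\server\backups -schedule:21:00 -include:C: -allCritical -quiet
wbadmin start backup -backupTarget:\\server\backups -include:C: -allCritical -quiet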

Note: In addition to these options, there are even more commands available through PowerShell. These are pretty well documented at

So while you will still need a 3rd party tool if you wish to back up to tape or you need very complex features, there’s now a very easy to use tool that integrates cloud and local storage backups for Windows Server and is just about as easy to manage and configure as Apple’s Time Machine is on OS X or OS X Server.

June 13th, 2013

Posted In: Active Directory, Windows Server


Programmatically controlling the cloud is an important part of trying to rein in the chaos of disparate tools that the beancounters make us use these days. Of all the companies out there, Microsoft seems to understand this about as well as anyone, and their fine programmers have provided us with a nice set of tools to manage Office 365 accounts, both in a browser (as with most cloud services) and in a shell (which is what we’ll talk about in this article).

This article isn’t really about scripting PowerShell. Instead, we’re just looking at a workflow that could be used to let a Student Information System, an HRIS solution or another tool with thousands of users in it communicate with Microsoft’s Office 365 cloud offering, providing access to Exchange, Lync, Access, Unified Messaging and, of course, Minesweeper. Wait, before you get carried away, I still haven’t found a way to access Minesweeper through PowerShell… Sorry…

In order to manage Office 365 objects, you will first need to import the MSOnline module (a module of cmdlets) and then connect to an account with administrative access to an Office 365 environment. To import the cmdlets, use the Import-Module cmdlet, indicating the module to import is MSOnline:

Import-Module MSOnline

The Get-Credential cmdlet prompts for a username and password and stores them as a credential object that other cmdlets can use. Once you have imported the appropriate cmdlets, connect to MS Online using the Connect-MsolService cmdlet with no operators, as follows:

Connect-MsolService
You will then be prompted for a valid Live username and password. The Connect-MsolService cmdlet also supports a -Credential operator (Connect-MsolService -Credential), which allows for injecting authentication information into the command in a script.
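
For scripted, non-interactive use, a minimal sketch might look like the following (the administrator account name is hypothetical):

# Capture the credential once, then hand it to Connect-MsolService
$cred = Get-Credential admin@example.onmicrosoft.com
Connect-MsolService -Credential $cred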

Next, set up a domain using New-MsolDomain along with the -Name operator followed by the name of the domain to use with Office 365:

New-MsolDomain -Name

The output would appear as follows, indicating that the domain is not yet verified:

Name                          Status          Authentication
                              Unverified      Managed

Once the domain has been created, in order to prove that you are authoritative for it, build a TXT record in DNS on the authoritative name server for the domain. To see what the TXT record should include, run Get-MsolDomainVerificationDns:

Get-MsolDomainVerificationDns -DomainName -Mode dnstxtrecord

The output would appear as follows:

Label :
Text : MS=ms123456789
Ttl : 3600

Once the TXT record is in place, you need to confirm the domain (which flips its status to Verified), done using Confirm-MsolDomain:

Confirm-MsolDomain -DomainName

Once the domain is verified, you can create a user within the domain. To see account information, use the Get-MsolUser cmdlet with no operators:

Get-MsolUser
To create an account, use the New-MsolUser cmdlet. This requires four attributes for the account being created: UserPrincipalName, DisplayName, FirstName and LastName. These are operators for the command as follows, creating an account called Charles Edge with a display name of Charles Edge and an email address of

New-MsolUser -UserPrincipalName "" -DisplayName "Charles Edge" -FirstName "Charles" -LastName "Edge"

Other attributes can be included as well, or you can use a CSV file to import accounts (a sketch of that follows the next command). Once created, you can use the Set-MsolUserPassword cmdlet to configure a password, identifying the principal with -UserPrincipalName and quoting the new password with -NewPassword. I also elected to not make the user change their password at next login (through the web portal, passwords are randomly generated and users have to reset them, so this is much closer to what we’ve traditionally done in Active Directory Users and Computers):

Set-MsolUserPassword -UserPrincipalName -NewPassword "reamde" -ForceChangePassword $false
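
As a sketch of the CSV approach mentioned above, assuming a hypothetical users.csv with UserPrincipalName, DisplayName, FirstName and LastName columns (the file name and password are placeholders):

# users.csv columns: UserPrincipalName,DisplayName,FirstName,LastName
Import-Csv .\users.csv | ForEach-Object {
    New-MsolUser -UserPrincipalName $_.UserPrincipalName -DisplayName $_.DisplayName -FirstName $_.FirstName -LastName $_.LastName
    Set-MsolUserPassword -UserPrincipalName $_.UserPrincipalName -NewPassword "reamde" -ForceChangePassword $false
}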

We can also use Set-MsolPasswordPolicy to change the password policy, although here we’ll use Set-MsolUser for the account so that the password never expires:

Set-MsolUser -UserPrincipalName -PasswordNeverExpires $true

Also, you could use Set-MailboxPermission to configure permissions on mailboxes. I’ve also found that Get-MsolAccountSku is helpful for seeing which license SKUs the tenant has, and that while I’m waiting for a domain to verify I can use Get-MsolDomain to see its status. Once the domain is accepted, Get-AcceptedDomain shows information about the domain. Set-MsolUserLicense can be used to manage who gets what license.
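
As a rough, hypothetical example of the licensing piece (the UPN and SKU name are placeholders; run Get-MsolAccountSku to see what your tenant actually has):

# A usage location must be set before a license can be assigned
Set-MsolUser -UserPrincipalName charles@example.com -UsageLocation US
Set-MsolUserLicense -UserPrincipalName charles@example.com -AddLicenses "example:ENTERPRISEPACK"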

Finally, all of this could be strung together into a subsystem by any organization to centrally bulk import and manage delegated domains in an Office365 environment. There are going to be certain areas where human intervention is required but overall, most of the process can be automated, and once automated, monitoring the status (e.g. number of accounts, etc) can also be automated, providing a clear and easy strategy for 3rd party toolsets to be integrated with the Office 365 service that Microsoft is providing. It is a new world, this cloud thing, but it sure seems a lot like the old world where we built middleware to do the repetitive parts of our jobs… Just so happens we’re tapping into their infrastructure rather than our own…

November 9th, 2012

Posted In: cloud, Microsoft Exchange Server, Windows Server


Google recently decided that it was time to force some other company to buy cloudy-dispositioned upstarts like Dropbox. Google also decided that Office 365 represented Microsoft being a little too brazen in its attempts to counteract the inroads that Google has made into Microsoft territory. Therefore, Google thumped its chest and gave away 5GB of storage in Google Drive. Google then released a tool that synchronizes data stored on a Google Drive to Macs and Windows systems.

Installing Google Drive is pretty easy. Just browse to Google Docs and Google will tell you that there’s this weird new Google Drive thing you should check out.

Here, click on Download Google Drive for Mac (or Windows if you use Windows). Then agree to give your first born to Google (but don’t worry, they’d never collect on that debt ’cause they’re sworn to do no evil).

Once downloaded, run the installer. You can link directly to your documents now using

The only real question the installer asks is whether you’d like to automatically sync your Google Drive to the computer. I said yes, but if you’ve got a smallish drive you might decide not to. Once the Google Drive application has been downloaded and installed, open it (by default it’s set to open at startup). You’ll then see an icon in the menu bar that looks a little like a recycling symbol. Here, click on Open Google Drive folder.

The folder with your Google Docs then shows up on your desktop. Copy an item in there and it syncs up to Google. It can then easily be shared through the Google Apps web portal and accessed from other systems.

While there are still a number of features that Dropbox and similar services will give you, due to the fact that they’re a bit more mature, I’d expect Google Drive to catch up fast. And given that I already have tons of documents in Google Docs, it is nice to have them saved down to my local system. I’m now faced with an interesting new challenge: where to draw the line in my workflow between Google Drive, Dropbox and the rest. Not a bad problem to have, really! Given the frustrations of having things strewn all over the place, I’ll want to minimize some of the haphazardness I’ve practiced with regard to why I put things in different places in the past. In some cases I need to be able to email to folders, have expiring links or have extended attributes sync between services, so there are some aspects that are likely to be case-by-case… Overall though, I’m very happy with the version 1 release of Google Drive. I mean, who complains about free stuff!?!?!

May 11th, 2012

Posted In: cloud, Mac OS X


Looks like Wave will be gone as of January. From Google:

Dear Wavers,

More than a year ago, we announced that Google Wave would no longer be developed as a separate product. At the time, we committed to maintaining the site at least through to the end of 2010. Today, we are sharing the specific dates for ending this maintenance period and shutting down Wave. As of January 31, 2012, all waves will be read-only, and the Wave service will be turned off on April 30, 2012. You will be able to continue exporting individual waves using the existing PDF export feature until the Google Wave service is turned off. We encourage you to export any important data before April 30, 2012.

If you would like to continue using Wave, there are a number of open source projects, including Apache Wave. There is also an open source project called Walkaround that includes an experimental feature that lets you import all your Waves from Google. This feature will also work until the Wave service is turned off on April 30, 2012.

For more details, please see our help center.

Yours sincerely,

The Wave Team

© 2011 Google Inc. 1600 Amphitheatre Parkway, Mountain View, CA 94043
You have received this mandatory email service announcement to update you about important changes to your Google Wave account.

November 23rd, 2011

Posted In: cloud, Social Networking


Amazon S3 now allows administrators to host simple web sites. Previously, you could host images, videos and other files using S3 buckets, but now you can host full sites. To do so, you need only configure a webroot and some error documents. To get started:

  1. Log into the Amazon S3 Management Console
  2. Right-click on an Amazon S3 bucket
  3. Open the Properties panel
  4. Click on the Website tab
  5. Configure your webroot in the Website tab
  6. Configure error documents in the Website tab
  7. Click Save

Pretty easy, right? But what if you need to configure the php.ini file, add MIME types, etc.? Notice that at the start of this I said “simple.” I’m sure more features are to follow, but for now S3 is mostly appropriate for very simplistic sites.
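
For what it’s worth, the same setup can also be scripted. Here’s a minimal sketch using the AWS command line tools, with a hypothetical bucket name and document names:

aws s3 website s3://my-example-bucket/ --index-document index.html --error-document error.html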

February 19th, 2011

Posted In: cloud


Back in November of 2008, I did an article on a way to use Amazon’s S3 to hook into Final Cut Server. At the time though, S3 had a pretty big limitation in that it wasn’t really suitable for backing up large video files as an archive device for Final Cut Server. But today, Amazon announced that S3 now supports files of up to 5 terabytes using multipart upload (previously the maximum file size was 5 gigabytes).

This finally means that files do not have to be broken up at the file system layer in order to back up to Amazon’s cloud. However, it does not mean that traditional disk-to-disk backup solutions can simply be pointed at S3 as a target for backups and archives, because large backups need to be performed using the multipart upload. The ability to use S3 for large files lets us finally use Amazon S3 in a way that is much simpler than it was previously, although it is still not as transparent as using a file system or URI path.
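
As a rough illustration from the modern AWS command line (which postdates this post; the bucket and file names are hypothetical), the high-level copy command handles the multipart mechanics automatically for large files:

aws s3 cp ./fcs-archive/project-backup.tar s3://my-example-bucket/archives/project-backup.tar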

Overall, this represents a great step for Amazon and I hope to see even more of this in the future!

December 10th, 2010

Posted In: cloud, Final Cut Server

The service in question is a cloud-based file sharing service that I used extensively in my last book. Similar to other services in the space, it allowed my publishers and me to automate our workflow with regard to the publishing process, but more importantly, I was actually able to do much of the review and exchange of files from the iPad, which was really nice given that the book was on iOS. I’ve been working with a few companies over the past few weeks on coming up with various strategies for cloud interoperability, and the service has come up a few times in this regard. Looks like I’m not the only one!

November 18th, 2010

Posted In: iPhone

Tags: , , , , , , ,

NAS (Network Attached Storage) devices are a popular alternative for providing centralized file services to smaller environments. This includes devices such as the Seagate BlackArmor, the DroboShare NAS and the Netgear ReadyNAS Pro. These are inexpensive compared to an actual server, they require less management and they often come with some pretty compelling features. But one of the primary reasons to buy a NAS can end up being a potential pain point as well: they require less management than a server because they can’t do as much as a server can.

Take, for example, the option to replicate between two of them. Most have NAS-to-NAS replication built in. However, that replication ends up being dependent on having two of them. But what if you just have a machine on the other side of the replication, want to back the device up remotely compressed, or want to back up to a cloud environment? Well, if it’s not the same daemon then you’re typically stuck with CIFS, NFS, HTTPS (WebDAV) or FTP. The devices don’t typically give you the option to push data directly from them, nor to run a daemon that a non-proprietary device can connect to directly, so you’d have to use a client to do the offsite sync. One example of how to do this would be to use JungleDisk and an Amazon S3 account. JungleDisk would mount the Amazon S3 storage and the NAS storage (all share points). You would then use a tool such as ChronoSync, Retrospect (Duplicate scripts, not Backup scripts, btw) or even rsync to back up the device over CIFS. It’s not pretty, it adds latency and management, but it would work.
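
As a rough sketch of the rsync piece, assuming JungleDisk has mounted the S3 bucket and the NAS share is mounted over CIFS (both mount points are hypothetical):

rsync -av --delete /Volumes/NAS_Share/ /Volumes/JungleDisk/Backups/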

The reason you would do synchronization is that if you attempt a backup (a la Retrospect Backup scripts) then you’d send big, monolithic files over the wire. The smaller the increments of data you can send over the wire, the better. Another tool that can do that type of sync is File Replication Pro. That would actually do blocks instead of files, pushing an even smaller increment of data over the wire. There are certainly other services. You could even open up the firewall (for just the specific ports/IP addresses requiring connectivity, which is always a potential security risk) and have a remote backup service come in and pull the data sync over FTP, CIFS or WebDAV (if you want to stick with a cloud backup solution), but those types of services are a bit more difficult to find.

The story is pretty much the same for cloud-based storage, with the exception that instead of NAS-to-NAS replication you’re looking for either a built-in feature or an API that allows you to develop your own. The moral of this story: if you use a NAS or a cloud-based solution and you want to back your data up, then your options are limited. Keep this in mind when you decide to purchase a NAS rather than, let’s say, a Mac OS X Server running on a Mac Mini with some Direct Attached Storage (DAS) connected to it.

October 27th, 2009

Posted In: Network Infrastructure

