Category Archives: cloud

cloud Mac Security Network Infrastructure

Configure Syslog Options on a Meraki

Meraki devices can push their logs to a syslog server. To configure this, open your Meraki Dashboard and click on a device. From there, click on “Alerts & administration”.


At the “Alerts & administration” page, scroll down to the Logging section. Click the “Add a syslog server” link and enter your syslog server’s hostname or IP address. Put the port number into the Port field, then choose which types of events to export. These can be Event Log, Flows or URLs, where:

  • Event Log: The messages from the dashboard under Monitor > Event log.
  • Flows: Inbound and outbound traffic flows generate syslog messages that include the source and destination and port numbers.
  • URL: HTTP GET requests generate syslog entries.

Note that you can direct each type of traffic to a different syslog server.
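Before wiring these events into a downstream tool, it can help to test your parsing against a canned message first. Here’s a minimal shell sketch that pulls fields out of a sample flows line; the message format and the values in it are assumptions for illustration, not output from your device:

```shell
#!/bin/sh
# Sample Meraki "flows" syslog line (format and values assumed for illustration).
line='1380664922.583851486 MX84 flows src=192.168.1.2 dst=8.8.8.8 protocol=udp sport=5353 dport=5353 pattern: allow all'

# Walk the space-separated fields and print just the source, destination and port.
for field in $line; do
  case $field in
    src=*|dst=*|dport=*) echo "$field" ;;
  esac
done
# → src=192.168.1.2
# → dst=8.8.8.8
# → dport=5353
```

The same loop works on a real line read from your syslog server once the Meraki is pointed at it.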

Active Directory cloud Consulting iPhone Kerio Mac OS X Mac OS X Server Mac Security Mass Deployment Microsoft Exchange Server Network Infrastructure Windows Server

Dig TTL While Preparing For A Migration

Any time you migrate data from one IP address to another, where that data has a DNS record pointing users at it, we need to keep the amount of time it takes to repoint the record to a minimum. To see the TTL of a given record, let’s run dig using +trace, with +nocmd to turn off showing the version and query options, +noall to turn off display flags, +answer to still show the answer section of the response and, most importantly for these purposes, +ttlid to toggle showing the TTL on. Here, we’ll use these to look up the TTL for the A record:

dig +trace +nocmd +noall +answer +ttlid a

The output follows the CNAME (as many a www record happens to be) to the A record and shows the TTL value (3600) for each: 3600 IN CNAME 3600 IN A

We can also look up the MX record using the same structure, just swapping out the a for an mx and the FQDN for the domain name itself:

dig +trace +nocmd +noall +answer +ttlid mx

The response is similar output, showing the TTL for each MX record: 3600 IN MX 0 3600 IN MX 10
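When planning the cutover window, it helps to turn that TTL into something human friendly. A quick sketch, run against a canned answer line rather than a live query (the hostname is just a placeholder):

```shell
#!/bin/sh
# A canned dig answer line; swap in real dig output when running this for real.
answer='www.example.com. 3600 IN CNAME example.com.'

# The TTL is the second whitespace-separated field of an answer line.
ttl=$(echo "$answer" | awk '{print $2}')
echo "TTL is $ttl seconds ($((ttl / 60)) minutes)"
# → TTL is 3600 seconds (60 minutes)
```

Drop the TTL ahead of the migration, wait out the old value, and the repoint window shrinks accordingly.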

cloud Mac OS X

One Liner To Install gcloud for Managing App Engine Instances

I had previously been using the gcutil command. But I cheated a little on the one-liner promise to get the new tool, gcloud, installed:

curl | bash ; unzip ; ./google-cloud-sdk/

The installation shell script is interactive and will ask if you want to update your bash profile. Once run, kill your terminal app and the new invocation will allow you to log into App Engine using the gcloud command followed by auth and then login:

gcloud auth login

Provided you’re logged into Google using your default browser, you’ll then be prompted to Accept the federation. Click Accept.


The gcloud command can then be used to check your account name:

gcloud config list

To then set a project as active in order to manage it, use the set option (or unset to stop managing it):

gcloud config set project kryptedmuncas

You can then use the components, sql or interactive verbs to connect to and manage instances. Each of these commands interfaces with the API, so if you ever find that you’ve exceeded what this simple command provides, you can always hit the API directly as well. I found that the interactive command was my favorite, as I could figure out what limitations I had using interactive and then work out how to accomplish the same tasks with commands from there.
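Since scripts built around gcloud tend to run unattended, it’s worth guarding against the SDK not being on the PATH at all. A small sketch; the require_cmd helper is mine, not part of the SDK:

```shell
#!/bin/sh
# require_cmd: fail early with a clear message if a command is missing.
require_cmd() {
  command -v "$1" >/dev/null 2>&1 || { echo "missing: $1" >&2; return 1; }
}

# Only query the active configuration if the SDK is actually installed.
if require_cmd gcloud; then
  gcloud config list
fi
```

The same guard works for any of the other tools a deployment script calls out to.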

cloud Network Infrastructure SQL Ubuntu Unix VMware Windows Server

Scripting Azure On A Mac

Microsoft Azure is Microsoft’s cloud services platform. Azure can host virtual machines and act as a location to store files. However, Azure can do much more as well: it can provide an Active Directory instance, provide SQL database access, work with hosted Visual Studio, host web sites and provide BizTalk services. All of these can be managed in the Azure portal.


You can also manage Windows Azure from the command line on Linux, Windows or Mac. To do so, download the command line tools and run the package installer.

When the package is finished installing, visit /usr/local/bin, where you’ll find the azure binary. Once installed, you’ll need to configure your account from the site to work with your computer. To do so, log into the portal.


Once logged in, open Terminal and then use the azure command along with the account option and the download verb:

azure account download

This downloads the .publishsettings file for the account you’re logged in as in your browser. Once downloaded, run azure with the account option and the import verb, providing the path to your .publishsettings file:

azure account import /Users/krypted/Downloads/WindowsAzure-credentials.publishsettings

The account import then completes and your user is imported into azure. Once imported, run azure with the account option and then storage list:

azure account storage list

You might not have any storage configured yet, but at this point you should see the following to indicate that the account is working:

info: No storage accounts defined
info: account storage list command OK
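In a script, you’d want to branch on that output rather than eyeball it. A sketch that parses a captured copy of the output above (canned here so it runs without an Azure account):

```shell
#!/bin/sh
# Captured "azure account storage list" output, canned for illustration.
output='info:    No storage accounts defined
info:    account storage list command OK'

# Branch on whether any storage accounts exist yet.
if echo "$output" | grep -q 'No storage accounts defined'; then
  echo "no storage accounts yet"
fi

# Confirm the command itself reported success.
echo "$output" | grep -q 'command OK' && echo "command succeeded"
```

For live use, capture the real output with `output=$(azure account storage list 2>&1)` and the same checks apply.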

You can also run the azure command by itself to see some neat ascii-art (although the azure logo doesn’t really come through in this spiffy cut and paste job):

info: _ _____ _ ___ ___________________
info:        /_\  |__ / | | | _ \ __|
info: _ ___ / _ \__/ /| |_| |   / _|___ _ _
info: (___ /_/ \_\/___|\___/|_|_\___| _____)
info: (_______ _ _) _ ______ _)_ _
info: (______________ _ ) (___ _ _)
info: Windows Azure: Microsoft's Cloud Platform
info: Tool version 0.7.4
help: Display help for a given command
help: help [options] [command]
help: Open the portal in a browser
help: portal [options]
help: Commands:
help: account to manage your account information and publish settings
help: config Commands to manage your local settings
help: hdinsight Commands to manage your HDInsight accounts
help: mobile Commands to manage your Mobile Services
help: network Commands to manage your Networks
help: sb Commands to manage your Service Bus configuration
help: service Commands to manage your Cloud Services
help: site Commands to manage your Web Sites
help: sql Commands to manage your SQL Server accounts
help: storage Commands to manage your Storage objects
help: vm Commands to manage your Virtual Machines
help: Options:
help: -h, --help output usage information
help: -v, --version output the application version

Provided the account is working, you can then use the account, config, hdinsight, mobile, network, sb, service, site, sql, storage or vm options. Each of these can be invoked along with a -h option to show a help page. For example, to see a help page for service:

azure service -h

You can spin up resources including sites, storage containers and even virtual machines (although you might need to create templates for VMs first). As an example, let’s create a new site using the git template:

azure site create --git
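To show what scripting around that looks like, here’s a dry-run sketch that would create several sites in one pass; the site names are hypothetical and the actual azure call is left commented out:

```shell
#!/bin/sh
# Hypothetical site names to create in bulk.
sites="staging production docs"

for site in $sites; do
  echo "would create: $site"
  # azure site create --git "$site"   # uncomment to actually create each site
done
```

Swap the list for one pulled from a file or another system and the loop becomes a simple bulk-provisioning tool.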

Overall, there are a lot of options available in the azure command line interface. The web interface is very simple, and the options in the command line interface mirror the options in the web interface. Running and therefore scripting around these commands is straightforward. I wrote up some Amazon stuff previously, but the azure controls are really full featured, and I’m becoming a huge fan of the service itself the more I use it (which likely means I’ll post more articles on it soon).

cloud Ubuntu Unix

25 Helpful Chrome OS Shell (crosh) Commands

To open Crosh, press Control-Alt-T.

Find commands:

help

Find debugging commands:

help_advanced

To switch to a more bash-like command prompt:

shell

To see the version of Chrome OS running on your Chromebook:

sudo /opt/google/chrome/chrome --version

To show the operating system name:

uname -a

If the operating system is a bit old, update it using the update_engine_client command:

update_engine_client -update

To see the BIOS version of your Chromebook, open up a command prompt (Control-Alt-T) and use the following command:

sudo /usr/sbin/chromeos-firmwareupdate -V

Look for (or grep for) the BIOS version in the output.

To record some sound from the microphone, use the sound command:

sound record 10

To see the Vital Product Data, or configuration information such as time zone, UUID, IMEI, model, region, language, keyboard layout and serial number:

sudo dump_vpd_log --full --stdout

Or to be specific about what you’re looking for, grep for it:

sudo dump_vpd_log --full --stdout | grep "serial_number"
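If you’re scripting around that output, the value can be split out of the key=value pair directly. A sketch against a canned line (the serial number here is made up):

```shell
#!/bin/sh
# A canned dump_vpd_log line; the serial number is made up for illustration.
vpd='"serial_number"="ABC123XYZ"'

# The value sits in the fourth double-quote-delimited field.
serial=$(echo "$vpd" | cut -d'"' -f4)
echo "serial: $serial"
# → serial: ABC123XYZ
```

Piped after the live grep above, the same cut pulls the serial number on its own for use in an asset inventory.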

To capture some logs for debugging, use systrace:

sudo systrace

To manage the mouse and keyboard acceleration and autorepeat options, use the xset command:

xset m

Trace a network path (like traceroute or tracert):

tracepath <destination>

To run standard network diagnostics:

network_diag

To capture some packets while troubleshooting network connections, use the packet_capture command:

packet_capture

Check the type, version, etc on your touchpad:

tpcontrol status

You can also debug network connections by logging data going through either the wifi, cellular or ethernet interface using the network_logging command. To do so for a normal 802.11 connection:

network_logging wifi

To configure WPA information:

wpa_cli

Accept an SSL Cert by using the enterprise_ca_approve command:

enterprise_ca_approve --allow-self-signed

Many standard Linux commands work as well, including route, mount, cat, cp, chmod, reboot, echo, tr, cut, mkdir, sed, if/then, ls, cd, pwd, su, sudo, etc. To see IP address information:

sudo ifconfig eth0

To see all of the running processes:

top

To see a user’s uid and gid, use id:

id

To ping Google:

ping google.com

To connect to another system, you can use ssh and there’s an ssh_forget_host command to clear a given host from your hosts list.

To see a list of the commands you’ve run:


Finally, to close the command prompt:

exit

Factory Reset (Powerwash) Chromebooks

If you ever lose track of the password on your Chromebook, find that the Chromebook is running oddly or want to sell it, you can remove your Google account and re-add it. The easiest way to do this is a feature called Powerwash. To pull it up, open Settings and then click on Advanced Settings. There, you’ll see the Powerwash button. Click it and you will remove all of the user accounts on the device, essentially performing a factory reset.

Powerwash can also be run by clicking Restart while holding down Control-Alt-Shift-R at the login screen. This brings up a Powerwash prompt where you simply need to click Powerwash to remove your data. The first time you login once the Powerwash process is complete, your apps and data start to sync back to the Chromebook.

cloud FileMaker Mac OS X Mac OS X Server Mac Security Mass Deployment Network Infrastructure Time Machine Xsan

Obtain Information From Watchman Monitoring Using a Script

Watchman Monitoring is a tool used to monitor computers. I’ve noticed recently that there’s a lot of traffic on the Watchman Monitoring email list that shows people want a great little (and by little I mean inexpensive from a compute time standpoint) monitoring tool to become a RMM (Remote Management and Monitoring) tool. The difference here is in “Management.” Many of us actually don’t want a monitoring tool to become a management tool unless we are very deliberate about what we do with it. For example, that script that takes a machine name of ‘rm -Rf /’ that some ironic hipster of a user decided to name their hard drive because, well, they can – well that script that was just supposed to run a permissions fix because that ironic jackass of a user in his v-neck with his funny hat and unkempt beard just accidentally cross-site script attacked himself and he’s now crying out of his otherwise brusque no-lens-having glasses and you’re now liable for his data loss because you didn’t sanitize that computer name variable before you sent it to some script.

Since we don’t want the scurrilous attention of hipsters everywhere throwing caustic gazes at us, we’ll all continue using a standard patch management system like Casper, Absolute, Munki, FileWave, etc. Many organizations can still take value out of using Watchman Monitoring (and tools like Watchman) to trigger scripted events in their environment.

Now, before I do this I want to make something clear. I’m just showing a very basic thing here. I am assuming that people would build some middleware around something a little more complicated than curl, but given that this is a quick and dirty article, curl’s all I’m using for examples. I’m also not giving up my API key as that would be silly. Therefore, if I were using a script, I’d have two variables in here. The first would be $MACHINEID, the client/computer ID you would see in Watchman. This would be what you see in red here, when looking at an actual computer.


The second variable is my API token. This is a special ID that you are provided from our friends at Watchman. Unless you’re very serious about building some scripts or middleware like right now, rather than bug them for it, give it a little while and it will be available in your portal. I’ve given the token $APITOKEN as my variable there.

The API, like many these days, is JSON. This doesn’t send entire databases or even queries, but instead an expression of each variable. So, to see all of the available variables for our machine ID, we’re going to use curl (I like to add -i to see my headers) and do the following lookup:

curl -i$MACHINEID.json?auth_token=$APITOKEN

This is going to spit out a bunch of information separated by commas, where each variable and then the contents of that variable are stored as quoted text. To delimit my results, I’m simply going to awk for a given position (using the comma as my delimiter instead of the default space). In this case, machine name is what I’m after:

curl -i$MACHINEID.json?auth_token=$APITOKEN | awk -F"," '{ print $4}'

And there you go. It’s that easy. Great work by the Watchman team in making such an easy to use and standards compliant API. Because of how common json is I think integrating a number of other tools with this (kinda’ like the opposite of the Bomgar implementation they already have) is very straight forward and should allow for serious automation for those out there that are asking for it. For example, it would be very easy to say take this output and weaponize it to clear caches before bugging you:

“plugin_id”:1237,”plugin_name”:”Check Root Capacity”,”service_exit_details”:”[2013-07-01] WARNING:  92% (276GB of 297GB) exceeds the 90% usage threshold set on the root volume by about 8 GB.”
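To show what that awk approach does, here’s the same idea run against a canned (and shortened) copy of the line above. Splitting JSON on commas is fragile in general, but it matches the quick-and-dirty approach in this post:

```shell
#!/bin/sh
# Canned, shortened Watchman response line from above.
json='"plugin_id":1237,"plugin_name":"Check Root Capacity","service_exit_details":"WARNING"'

# Grab the second comma-delimited field, as in the curl | awk pipeline above.
echo "$json" | awk -F',' '{print $2}'
# → "plugin_name":"Check Root Capacity"
```

For anything beyond a one-off, a real JSON parser would be the sturdier choice, since field positions shift as the API adds attributes.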

Overall, I love it when I have one more toy to play with. You can automatically inject information into asset management systems, trigger events in other systems and if need be, allow the disillusioned youth the ability to erase their own hard drives!

Articles and Books Business cloud iPhone Mac OS X Mac OS X Server Mac Security Mass Deployment

Apple's Customer Facing SystemStatus

Apple now has a new system status page for their services. The site goes through many of Apple’s services and shows an indicator light for when they are up. Additionally, you can scroll down to the detailed timeline and see a historical account of which services have been online.

This is yet another step in Apple’s continued progress at providing more and more information to the community on, well, everything. This includes seeing Apple popping up at conferences here and there, most notably at Black Hat this year, publishing more kbase articles that detail problems and allowing more community involvement from some employees. A more open Apple is a more enterprise, education and consumer friendly Apple.

cloud Microsoft Exchange Server Windows Server

Managing Office 365 Users Using PowerShell

Programmatically controlling the cloud is an important part of trying to rein in the chaos of disparate tools that the beancounters make us use these days. Of all the companies out there, Microsoft seems to understand this about as well as anyone, and their fine programmers have provided us with a nice set of tools to manage Office 365 accounts, both in a browser (as with most cloud services) and in a shell (which is what we’ll talk about in this article).

This article isn’t really about scripting PowerShell. Instead we’re just looking at a workflow that could be used to script a Student Information System, HRIS solution or another tool that has thousands of users in it to communicate with Microsoft’s 365 cloud offering, providing access to Exchange, Lync, Access, Unified Messaging and of course, minesweeper. Wait, before you get carried away, I still haven’t found a way to access minesweeper through PowerShell… Sorry…

In order to manage Office 365 objects, you will first need to import the MSOnline module (e.g. of cmdlets) and then connect to an account with administrative access to an Office365 environment. To import the cmdlets, use the Import-Module cmdlet, indicating the module to import is MSOnline:

Import-Module MSOnline

The Get-Credential cmdlet prompts for a set of credentials to use. Once you have imported the appropriate cmdlets, connect to MS Online using the Connect-MsolService cmdlet with no operators, as follows:

Connect-MsolService

You will then be prompted for a valid Live username and password. The Connect-MsolService cmdlet also supports a -Credential operator (Connect-MsolService –Credential) which allows for injecting authentication information into the command in a script. Next, setup a domain using New-MsolDomain along with the -Name operator followed by the name of the domain to use with Office 365:

New-MsolDomain -Name

The output would appear as follows, indicating that the domain is not yet verified:

Name                    Status                  Authentication
                        Unverified              Managed

Once created, in order to prove that you are authoritative for the domain, build a TXT record in the DNS on the authoritative name server for the domain. To see what the TXT record should include, run Get-MsolDomainVerificationDns:

Get-MsolDomainVerificationDns -DomainName -Mode dnstxtrecord

The output would appear as follows:

Label :
Text : MS=ms123456789
Ttl : 3600
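Before moving on to confirm the domain, you can check from any shell whether that TXT record has propagated. This sketch tests against a canned record value; for a live check you’d feed it something like the output of `dig +short TXT yourdomain.com` (domain hypothetical):

```shell
#!/bin/sh
# Canned TXT record value; replace with live dig output for a real check.
txt='"MS=ms123456789"'

case "$txt" in
  *MS=ms*) echo "verification record found" ;;
  *)       echo "record not visible yet" ;;
esac
# → verification record found
```

Polling with this in a loop saves re-running the PowerShell confirmation before DNS has caught up.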

Once the TXT record is in place, confirm that you are authoritative for the domain using Confirm-MsolDomain:

Confirm-MSolDomain -DomainName

Once the domain is confirmed, you can create a user within it. To see account information, use the Get-MsolUser cmdlet with no operators:

Get-MsolUser

To create an account, use the New-MsolUser cmdlet. This requires four attributes for the account being created: UserPrincipalName, DisplayName, FirstName and LastName. These are operators for the command as follows, creating an account for Charles Edge with a display name of Charles Edge:

New-MsolUser -UserPrincipalName "" -DisplayName "Charles Edge" -FirstName "Charles" -LastName "Edge"

Other attributes can be included as well, or you can use a csv file to import accounts. Once created, you can use the Set-MsolUserPassword cmdlet to configure a password, identifying the principal with -UserPrincipalName and the new password quoted with -NewPassword. I also elected not to make the user change their password at next login (through the web portal, users have to reset their passwords and they’re randomly generated, so this is much more traditionally equivalent to what we’ve done in Active Directory Users and Computers):

Set-MsolUserPassword -userPrincipalName -NewPassword "reamde" -ForceChangePassword False

We can also use Set-MsolPasswordPolicy to change the password policy, although here we’ll use Set-MsolUser for the account so that the password never expires:

Set-MsolUser -UserPrincipalName -PasswordNeverExpires True

Also, you could use Set-MailboxPermission to configure permissions on mailboxes. I’ve also found that Get-MsolAccountSku is helpful to get information about the actual account I’m logged in as and while I’m waiting for a domain to verify that I can use Get-MsolDomain to see the status. Once the domain is accepted, Get-AcceptedDomain shows information about the domain. Set-MsolUserLicense can be used to manage who gets what license.

Finally, all of this could be strung together into a subsystem by any organization to centrally bulk import and manage delegated domains in an Office365 environment. There are going to be certain areas where human intervention is required but overall, most of the process can be automated, and once automated, monitoring the status (e.g. number of accounts, etc) can also be automated, providing a clear and easy strategy for 3rd party toolsets to be integrated with the Office 365 service that Microsoft is providing. It is a new world, this cloud thing, but it sure seems a lot like the old world where we built middleware to do the repetitive parts of our jobs… Just so happens we’re tapping into their infrastructure rather than our own…

cloud iPhone Mac OS X Mac OS X Server Mass Deployment MobileMe

iWork Public Beta Goes Bye-Bye Today :: Last Call

I’m sure you’ve heard by now. But just in case you hadn’t logged in in a while or let the to-do lapse, it’s worth a reminder that iWork Public Beta, the site that you could upload Pages, Numbers and Keynote documents to, is being deprecated. The end comes today.

In other words, if you have documents up on the site, you should download them immediately, or you won’t be able to once August comes. Apple has even provided a document explaining how.

The service that was being provided by the iWork public beta is replaced by iCloud. Using iCloud, you can sync your documents between all of your devices. When you configure iCloud in System Preferences, you are prompted to sync contacts, calendars and bookmarks, but iCloud also gets configured for file synchronization as well at that time. While iCloud doesn’t allow you to edit documents online, you can access them through the iCloud web portal and download them from any computer you like. The new iCloud integration also allows for seeing all your documents in each supported app, when first opened: