krypted.com

Tiny Deathstars of Foulness

Migrating from macOS Mail Server is going to be one of the stranger migrations you might do. Why? Unless you’re moving to basically a custom build of the same tools used in macOS Server (which you’d do by forklifting /Library/Server/Mail/ into a postfix environment and putting the various components Apple changed at compile-time back together), the process for moving to a modern system is going to rely on IMAP and look a little like this:
  • Get a list of accounts
  • Provide the password for each account
  • Set up an initial sync of mailbox contents
  • Look for errors
  • On the day that you cut MX records, do another sync
  • On the day that you cut MX records, migrate local accounts
  • Do a final sync
  • Archive the spam account
  • Take the server offline
You can do this with less effort (e.g. users back up their own mailboxes, you do the sync once, etc.), but in my experience the above process has produced the best result for the consumers of mail services and for customers of various types of consultancies. The technical portion of this is pretty straightforward if you follow these steps. The part I like the least is the fact that whosoever has access to those passwords has access to mailboxes, and your actions during that time are very much open to interpretation.

Let’s start by looking at the domains running on the mail server. We’ll do this with serveradmin, by looking at the settings of the mail:postfix:domains:_array_index:

sudo /Applications/Server.app/Contents/ServerRoot/usr/sbin/serveradmin settings mail:postfix:domains:_array_index:

The return will list all of the domains running on the server:

mail:postfix:domains:_array_index:0:name = "krypted.com"

mail:postfix:domains:_array_index:1:name = "kryptedadmin.com"

The primary domain, or the one in _array_index:0, should map to the mydomain variable in /Library/Server/Mail/Config/postfix/main.cf. All of the users will be stored in /Library/Server/Mail/Config/postfix/virtual_users. To see them, simply cat that file:

cat /Library/Server/Mail/Config/postfix/virtual_users

Which would return a line similar to this one for each email account:

charles.edge@krypted.com charles.edge

To just see a list of email addresses, you could run:

sudo grep -E -o "\b[a-zA-Z0-9.-]+@[a-zA-Z0-9.-]+\.[a-zA-Z0-9.-]+\b" /Library/Server/Mail/Config/postfix/virtual_users
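If you want to keep that list around for later scripting, redirect the matches to a working file and sanity-check the count. A minimal sketch; the heredoc sample below stands in for the real virtual_users file:

```shell
# Sample data standing in for /Library/Server/Mail/Config/postfix/virtual_users:
cat > virtual_users.sample <<'EOF'
charles.edge@krypted.com charles.edge
krypted@krypted.com krypted
EOF

# Extract just the addresses into a working file:
grep -E -o "\b[a-zA-Z0-9.-]+@[a-zA-Z0-9.-]+\.[a-zA-Z0-9.-]+\b" virtual_users.sample > accounts.txt

# Sanity-check the account count against what you expect:
wc -l accounts.txt
```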

Now that you have a list of email addresses, you can easily put them into a file that will sync mailboxes. There are a number of tools you could use to migrate the actual mail; imapsync is the one I'll use here. In its most basic form, you could just do:

imapsync --host1 oldmail.krypted.com --user1 charles.edge --password1 mypassword --host2 newmail.krypted.com --user2 charles.edge --password2 mynewpassword

For a small set of users you could easily paste a command like this for each account into a .sh file, run it, run it again the night of the cutover, and then run a cleanup a couple of days later in case there were any stragglers. This isn't going to work for everyone, though; a lot of people use custom settings in their mail apps. If necessary, you can also configure ports for both servers with --port1 and --port2. You can also configure SSL, synchronization options, regular expression conversions on objects during the migration, include and exclude mail folders, move passwords into a file, etc. Before doing the sync, I'd recommend syncing a test mailbox and reading the entire manpage at https://github.com/imapsync/imapsync.
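For more than a handful of accounts, those per-user commands can be generated from a password file rather than typed by hand. A minimal sketch; the semicolon-separated users.csv format and both host names are assumptions for illustration:

```shell
# Sample account file in an assumed "user;oldpass;newpass" format; the hosts
# below are placeholders for your real source and destination servers:
cat > users.csv <<'EOF'
charles.edge;oldpass;newpass
EOF

# Build one imapsync invocation per account. Writing them to a file lets you
# review before running; pipe the file to sh when you're happy with it.
while IFS=';' read -r user old new; do
  echo "imapsync --host1 oldmail.krypted.com --user1 $user --password1 $old --host2 newmail.krypted.com --user2 $user --password2 $new"
done < users.csv > sync_commands.txt
```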

Changing MX and getting mail to actually flow to the new servers is just a matter of making sure there's an A record for the new mail server and pointing the MX record at it. I recommend setting the TTL of your DNS records for mail servers as low as your DNS server or registrar will allow until the migration process is complete. The reason for this is that you want to keep the window during which mail could flow to both servers to a minimum. The final sync is needed because DNS changes aren't instant: some DNS servers get a lot of traffic and so don't respect the TTL for a given record. Therefore, that final sync pulls everything that might have accidentally been flowing into your old server over to your new server.
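You can watch the records from the command line while the TTL winds down; dig is assumed to be available, and the domain below is just this site as an example (the short timeout keeps the query from hanging if the resolver is unreachable):

```shell
# Each answer line shows name, remaining TTL (seconds), class, type, value:
dig +noall +answer +time=2 +tries=1 MX krypted.com || true

# And confirm the A record the new MX points at actually resolves:
dig +noall +answer +time=2 +tries=1 A newmail.krypted.com || true
```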

Next, you’ll need to change the host name of the mail server in mail clients and hopefully have users reset their passwords so you don’t have access to their mail any longer. For this, I recommend pushing a profile (e.g. using an MDM or command line equivalent). 

I have seen a number of environments (and helped in some cases) get really crafty. They might present a user with a forced “change password” dialog and then have that stored in a file that admins can’t access. It’s in clear text and there’s always risk, but this allows for non-repudiation. It also means that when you send a profile to a device, you can have the new account show up and work without the user having to enter a password. Every time users have to touch something, there’s the chance it will get mistyped (I typo things all the time) and a chance that they’ll be confused and call you; so the larger the user base you’re migrating, the more logic you’ll hopefully be able to apply to this process to help keep your phone from ringing. I’ve also seen environments where admins had users type in the password, monitor the sync, and then proceed using client-side scripts. This has always been fraught with peril, but offers an added sense of privacy.

Don’t forget to grab the spam quarantine mailbox. It seems like this is the main reason I’ve had to revive dead servers: something got put there and someone needs it… Migrating that mailbox the same way you would any other is a good idea, just in case. If you don’t know your quarantine address, run the following to find it:

sudo serveradmin settings mail:postfix:spam_quarantine

Once you’re sure that no mail is flowing to the old server (72 hours is usually a good time frame), you can pull the old server offline. I recommend keeping the server or a clone of the server forever. I’ve needed to revive them here and there due to a variety of reasons that have nothing to do with data integrity of what was migrated. You never know. And if you’re a consultant, there’s no easier way to get fired than to go mucking about with access to mail without a lot of communication in advance. 

Overall, this process can be pretty seamless to your users. But it requires more labor on your side. To keep costs and effort down for you, you could type up a document that steps people through things, but I prefer people at work liking me, so wouldn’t do that personally. Good luck and please comment here if you have further tools or workflows that you prefer!

February 15th, 2018

Posted In: Mac OS X Server


Been working on a new plugin to embed device details from Jamf Pro into Jira Service Desk. It looks a little like this:


To access the plugin, see the links below.

February 13th, 2018

Posted In: JAMF


Last week, Apple finally shipped my new HomePod (and by finally, I mean exactly when they said they would). And setting it up couldn’t have been easier. Even easier than setting up my first Echos. So here’s the deal. Plug in the HomePod and then when it boots up you’ll see an overlay on an iOS device (iPhone, iPad, etc). You’ll want to use the device that has an AppleID you want to use on the HomePod (e.g. the one that your Apple Music account is using). When you see the Set Up button, tap it.



You can then select a location for the HomePod. This is important mostly if you’re going to have multiple HomePods around. Select a location and then tap Continue.



At the Personal Requests screen, tap Enable Personal Requests if you want the device to allow access to your iCloud account for things like, sending a message (note: unintended consequences include but are not limited to children deleting bad report cards, adding weird items to the grocery list, and sending messages from one parent to the other).



At the Terms and Conditions screen, tap Agree if you agree to the terms; otherwise put the device back in the box and return it.



At the Accounts and Settings screen, you can transfer settings to the HomePod, which gives the HomePod access to the Wi-Fi password for your network (so your phone doesn’t have to be close to the HomePod for it to work).



Next, you need to ask Siri a question.



I recommend asking “Siri, how are you today?”



Once configured, you can go to Settings and AppleID to see the HomePod.



From there, you can see the model, version, serial, and if you happened to configure the HomePod to work with the wrong AppleID, you can tap Remove from Account to be able to configure the device with a different account.



And finally, open the Home app and you’ll see your device. 



From there, tap on the device and you’ll have a few more settings for how the HomePod works with the Home app. Here, you can change the room, change the AppleID, choose to include in the Favorites of your home screen, and disable access to Explicit Content. 



Scroll down and you can choose to share HomePod Analytics. Notice that this is opt-in and they’re clear about how they’ll use it if you enable it. 



So the setup is simple. I’ll have another article for configuring some home automations, so you can control them with the HomePod.

February 12th, 2018

Posted In: Home Automation, iPhone


AutoPkgr is basically a small app that allows you to select some repositories of recipes and then watch them and run them when they update. It’s a 5-minute-or-less installation, and at its simplest it will put software packages into a folder of your choosing so you can test/upload/scope them to users. Or you can integrate it with 3rd party tools like Munki, FileWave, or Jamf using the JSSImporter. Then, if you exceed what it can do, you can also dig under the hood and use AutoPkg itself. It’s an app, and so it needs to run on a Mac. Preferably one that doesn’t do much else. 


Installing AutoPkgr

You can obtain the latest release of AutoPkgr at https://github.com/lindegroup/autopkgr. To install, drag the app to the Applications folder. 

When you open AutoPkgr for the first time, you’ll be prompted for a user name and password to install the helper tool (think menu item). 

The menu item then looks like the following.

These are the most common tasks that administrators run routinely: primarily, checking AutoPkg recipes to see if there are new versions of supported software titles. Opening the AutoPkgr app itself, though, shows us much more. Let’s go through this screen-by-screen in the following sections.


Moving AutoPkg Folders Around 

By default, when installed with AutoPkgr, AutoPkg stores its cache in ~/Library/AutoPkg/Cache and the repos are synced to ~/Library/AutoPkg/RecipeRepos. You can move these using the Choose… button in the Folders & Integration tab of AutoPkgr, although it’s not necessary (unless, for example, you need to move the folders to another volume). 

Note: You can also click on the Configure AutoPkg button to add proxies, pre/post-processing scripts, and GitHub tokens if needed. 


Keeping AutoPkg and Git up-to-date

The Install tab is used to configure AutoPkg settings. If there is a new version of AutoPkg or Git, you’ll see an Install button for each (used to obtain the latest and greatest scripts); otherwise you’ll see a green button indicating it’s up-to-date. 

You can also configure AutoPkgr to launch at login, and show or hide the AutoPkgr menu item and Dock icon. 


Configuring Repositories and Recipes

Repositories are where collections of recipes live. Recipes are how they’re built. Think of a recipe as a script that checks for a software update and then follows a known-good way of building that package. Recipes can then be shared (via GitHub) and consumed en masse. 

To configure a repository, click on the “Repos & Recipes” tab in AutoPkgr. Then select the repos to use (they are sorted by stars, so the most popular appear first). 

Note: There are specific recipes for Jamf Pro at https://github.com/autopkg/jss-recipes.git.

Then you’ll see a list of the recipes (which, again, will make packages) that AutoPkgr has access to. Check the ones you want to build and click Run Recipes Now. 

If you don’t see a recipe for a title you need, use the search box at the bottom of the screen, which searches across any repos you’ve added. Again, all of the sharing of these repos typically happens through GitHub, but you can add any git URL (e.g. if you want a repo of recipes internal to your organization). 

Once you’ve checked the boxes for all the recipes you want to automate, you can then use the “Run AutoPkg Now” option in the menu items to build, or rely on a routine run, as described in the next section.
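If the checkboxes ever feel limiting, the same work can be done with the underlying autopkg command line that AutoPkgr drives. A sketch, guarded so it does nothing on a machine without autopkg installed; the recipe name is just an example:

```shell
if command -v autopkg >/dev/null 2>&1; then
  autopkg repo-add recipes            # add the core AutoPkg recipe repo
  autopkg list-recipes                # show every recipe the added repos provide
  autopkg run -v Firefox.pkg.recipe   # build a single package, verbosely
else
  echo "autopkg is not installed on this machine"
fi
```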


Scheduling Routine Builds

AutoPkgr can schedule a routine run to check recipes. This is often done at night, after administrators leave the office. To configure it, click on the Schedule tab and then check the box for “Enable scheduled AutoPkg runs”. You can also choose to update your recipes from the repos by checking the “Update all recipes before each AutoPkg run” checkbox.


Getting Notified About New Updates To Packages

I know this sounds crazy. But people like to get notified when there’s a new thing showing up. To configure a variety of notification mechanisms, click on the Notifications tab in AutoPkgr.

Here, you can configure alerts via email, Slack, HipChat, macOS Notification Center, or via custom webhooks.


Integrating Autopkg with Jamf (and other supported vendors)

When integrating with another tool, you’ll need to first install the integration. To configure the JSSImporter, open the “Folders & Integration” tab in AutoPkgr and then click on the Install JSSImporter button.

Once installed, configure the URL, user name, and password (for an account with API access) and configure any distribution points that the resultant packages should be copied to. 


Once the JSSImporter is configured, software should show up in Jamf Pro scoped to a new group upon each build. It is then up to the Jamf administrator to complete the scoping tasks so software shows up on end user devices.


What the JSSImporter Does from AutoPkg

This option doesn’t seem to work at this time. Using the following may make it work:

sudo easy_install pip && pip install -I --user pyopenssl

Note: The above command may error if you’re using macOS Server. If so, call easy_install directly via 

/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/easy_install.py.

February 9th, 2018

Posted In: Mac OS X, Mac OS X Server, Mac Security, Mass Deployment


“Taking a new step, uttering a new word, is what people fear most.” ― Fyodor Dostoyevsky, Crime and Punishment

The Apple Wiki Server is sadly going away. I always liked this service. It was thoughtfully designed and looked much nicer than most of the other tools available out there. Sure, you couldn’t write articles offline, write in markdown, or do a lot of other things that I’ve learned to both love and hate from other solutions, but honestly it always felt the most Apple of services in macOS Server because it didn’t have every-single-checkbox. So, I’ll pour a little Jäger on the ground in memory of the wiki server and then… export some stuffs and move on.

Before we get started, let’s talk about where you’re going to be putting this stuff. You can export in five formats (in order of the likelihood you’ll use them): 
  • wxr: the native WordPress format that can also be used by a variety of other solutions, as WordPress is their reference competitor of sorts. WordPress also has wiki plugins, which you could experiment with once you’ve imported all of your stuff. ExpressionEngine, Drupal, and many other solutions support importing via the wxr format. For importing into Confluence, check out https://wiki.afm.co/display/PUBL/HOW+to+import+Wordpress+into+Confluence, which requires you to run your own server temporarily if your ultimate goal is to move to the Atlassian Cloud (which is pretty much what I ended up doing). 
  • pages: A folder that stores static html, rather than the dynamic html files that the wiki services builds. You might use this if you just want to take a permanent archive of the wiki service and maybe hire an intern to cut/copy/paste pages into a new wiki solution, like Confluence. 
  • json: If you’re going to be scripting a custom import into another solution, then json is likely to be your best bet. I blame python; there are more modules for manipulating json than I can count. If another wiki or documentation tool has an import option, you can find a way to get it in. You’ll likely encounter broken links, etc., unless you correct those in the script during import. Which is a lot of logic. 
  • legacy: Uses PostgreSQL. Probably not useful for a lot of people. I’ve done some work reverse engineering that database, but it changes routinely and so that work is put out of date at regular intervals.
  • decoded: A more swifty export. I think I’d just use this if I were building a swift app based on my export. Which I can’t imagine doing.

The export command is now built into wikiadmin (unlike a couple of previous articles I wrote where it was a standalone command, back in the 10.5 or 10.6 era). So to export, you’ll run wikiadmin followed by the export verb. The next option you’ll provide is either -all or the -name of the wiki (multiple wikis can be provided as comma-separated values), then the -format you’re exporting into (listed above), and finally the -path of the destination. Putting that all together would look like the following if we’re exporting into WordPress: 

sudo /Applications/Server.app/Contents/ServerRoot/usr/bin/wikiadmin export -name Legal -format wxr -path /exports

Exports will be owned by _teamserver, so to get at the data you’ll have to loosen the permissions (a+rX grants read access plus directory traversal):

sudo chmod -R a+rX /exports

Inside that target directory, you’ll see a number of files as you can see in the following screenshot:

So, for wxr there will be a directory called wiki that doesn’t have much in it. And then there will be an xml file for each user that has a “page” as well as another with all the articles in it. You’ll need to look at them, but typically the biggest and last one exported (so the last one in the directory listing) will have all your articles.

Once you have the correct XML file, you can import it! To do so, go to WordPress, hover over tools in the sidebar and click on Import. At the import screen, select the xml file, and then click on the “Upload file and import” button.




Note: If the import fails, you may have to edit your php.ini to increase post_max_size or upload_max_filesize. But once the import is complete, I’d change those back to the defaults.
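The two directives look like this in php.ini; 64M is an arbitrary example, just make it comfortably larger than your exported XML file:

```ini
; Temporary overrides for a large WXR import; restore the defaults afterward
post_max_size = 64M
upload_max_filesize = 64M
```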

Now, let’s say instead that I’m exporting to json. In the following iteration of the earlier command, I’m going to use -all and then json as the -format:

sudo /Applications/Server.app/Contents/ServerRoot/usr/bin/wikiadmin export -all -format json -path /exports

This output starts as follows (I left off the subsequent articles):

{
  "LongName" : "Legal",
  "UpdateTime" : "2018-02-09T08:37:41.691-0600",
  "Description" : "test",
  "ExportDate" : "2018-02-09 16:06:51 +0000",
  "Revision" : 1,
  "UpdatedByLogin" : "charles.edge",
  "BlogPosts" : [
  ],
  "Theme" : "carbon,,",
  "WikiUID" : "c7de0acb-baae-baae-e9bd-f78d3e4d43ed",
  "ShortName" : "legal",
  "Pages" : [
    {
      "LongName" : "test page 1",
      "UpdateTime" : "2018-02-09T08:38:32.025-0600",
      "UpdatedByLogin" : "charles.edge",
      "Revision" : 3,
      "PageTextValue" : "To edit this page, click the Edit (pencil) button. To delete this page, click the Action (gear) button and choose Delete. When you edit this page, you can easily rename the page, and use the editing toolbar to: Apply paragraph or character styles to text. Create bulleted lists, numbered lists, and tables. Insert media, such as images, audio, or QuickTime movies. Attach files. Insert an HTML snippet from another website or email. For more information about editing pages, click the Action (gear) button and choose Help. 1 1",
      "TinyID" : "x4j836N4C",
      "Tags" : [
        " test",
        "wtf"
      ],
      "CreateTime" : "2018-02-09T08:38:07.039-0600",
      "RelatedItems" : [
        {
          "LongName" : "Legal",
          "UID" : "c7de0acb-baae-baae-c6c2-aae392690186"
        }
      ],
      "CreatedByLogin" : "charles.edge",
      "RenderedPage" : "<div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32\" class=\"block wrapchrome text\" data-guid=\"24c701f8-0e15-49a7-b331-f8b755cb6a32\" data-type=\"text\" contenteditable=\"false\"><div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32-wrapper\" class=\"wrapper wrapchrome\"><div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32-inner\" class=\"inner wrapchrome\"><div class=\"content selectable wrapchrome\"><div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32-editable\" class=\"editable wrapchrome\"><p>To edit this page, click the Edit (pencil) button. To delete this page, click the Action (gear) button and choose Delete.<\/p><p>When you edit this page, you can easily rename the page, and use the editing toolbar to:<\/p><ul><li>Apply paragraph or character styles to text.<\/li><li>Create bulleted lists, numbered lists, and tables.<\/li><li>Insert media, such as images, audio, or QuickTime movies.<\/li><li>Attach files.<\/li><li>Insert an HTML snippet from another website or email.<\/li><\/ul><p>For more information about editing pages, click the Action (gear) button and choose Help. 1 1<\/p><\/div><\/div><\/div><\/div><\/div>",
      "RevisionHistory" : [
        {
          "ChangedByLogin" : "charles.edge",
          "Version" : 1,
          "PageTextValue" : "",
          "RenderedPage" : "<div class=\"block text\"><div class=\"content\"><div class=\"editable\"><p>To edit this page, click the Edit (pencil) button. To delete this page, click the Action (gear) button and choose Delete.<\/p><p>When you edit this page, you can easily rename the page, and use the editing toolbar to:<\/p><ul><li>Apply paragraph or character styles to text.<\/li><li>Create bulleted lists, numbered lists, and tables.<\/li><li>Insert media, such as images, audio, or QuickTime movies.<\/li><li>Attach files.<\/li><li>Insert an HTML snippet from another website or email.<\/li><\/ul><p>For more information about editing pages, click the Action (gear) button and choose Help.<\/p><\/div><\/div><\/div>",
          "ChangeType" : "create",
          "ChangeTime" : "2018-02-09T08:38:07.039-0600"
        },
        {
          "ChangedByLogin" : "charles.edge",
          "Version" : 2,
          "PageTextValue" : "To edit this page, click the Edit (pencil) button. To delete this page, click the Action (gear) button and choose Delete. When you edit this page, you can easily rename the page, and use the editing toolbar to: Apply paragraph or character styles to text. Create bulleted lists, numbered lists, and tables. Insert media, such as images, audio, or QuickTime movies. Attach files. Insert an HTML snippet from another website or email. For more information about editing pages, click the Action (gear) button and choose Help. 1",
          "RenderedPage" : "<div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32\" class=\"block wrapchrome text\" data-guid=\"24c701f8-0e15-49a7-b331-f8b755cb6a32\" data-type=\"text\" contenteditable=\"false\"><div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32-wrapper\" class=\"wrapper wrapchrome\"><div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32-inner\" class=\"inner wrapchrome\"><div class=\"content selectable wrapchrome\"><div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32-editable\" class=\"editable wrapchrome\"><p>To edit this page, click the Edit (pencil) button. To delete this page, click the Action (gear) button and choose Delete.<\/p><p>When you edit this page, you can easily rename the page, and use the editing toolbar to:<\/p><ul><li>Apply paragraph or character styles to text.<\/li><li>Create bulleted lists, numbered lists, and tables.<\/li><li>Insert media, such as images, audio, or QuickTime movies.<\/li><li>Attach files.<\/li><li>Insert an HTML snippet from another website or email.<\/li><\/ul><p>For more information about editing pages, click the Action (gear) button and choose Help. 1<\/p><\/div><\/div><\/div><\/div><\/div>",
          "ChangeType" : "edit",
          "ChangeTime" : "2018-02-09T08:38:26.194-0600"
        },
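Since json is the scripting-friendly format, here's a minimal sketch of pulling every wiki and page title back out of an export. The sample file below mirrors the structure above; a real export lives under the -path you handed wikiadmin:

```shell
# Tiny sample mirroring the export structure (LongName appears on wikis and pages):
cat > wiki-sample.json <<'EOF'
{ "LongName" : "Legal", "Pages" : [ { "LongName" : "test page 1" } ] }
EOF

# List every LongName in the export:
grep -o '"LongName" : "[^"]*"' wiki-sample.json | sed 's/.*: "//; s/"$//'
```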

There’s a glaring omission in this article: files. I’ll get to that in another article. But the gist is that you can mount the wiki over WebDAV and move them, but you kinda’ break any links…

Finally, if I came up short in this article (as I often do), here are the examples and reference sections from the official wikiadmin man page: 

wikiadmin export -name somewiki,anotherwiki -path /var/tmp/two-exported-wikis

wikiadmin import -all -path /var/tmp/two-exported-wikis

wikiadmin import -all -path /var/tmp/two-exported-wikis/Exported.wikis

wikiadmin export -all -format json -path /var/tmp/readable-wikis

wikiadmin export -all -format pages -path /var/tmp/browsable-wikis

wikiadmin export -all -format wxr -path /var/tmp/wxr-wiki-files

wikiadmin migrate -r /Volumes/SnowLeopard/Library/Collaboration

RETURN VALUES

wikiadmin returns a status code of 0 for success. In the event of failure it returns a non-zero status, and writes error messages to stderr.

FILES

/Library/Server/Wiki/Logs/wikiadmin.log

Log file for wikiadmin activity

/Library/Server/Wiki/Logs/collabd.log

Log file for Wiki http server activity

/Library/Server/Wiki/, /tmp/, /var/tmp, /Users/Shared

Folders readable and writable by user _teamserver where exports can typically be placed

<export-path>/Exported.wikis

The name of the bundle created when exporting wikis for formats other than pages.

<export-path>/FileData/*/*/*

For -format pages – exported files that were uploaded to the wiki as files or embedded in pages

<export-path>/wiki/projects/*/*.html

For -format pages – exported wiki main pages and user profile pages

<export-path>/wiki/pages/*/*.html

For -format pages – exported wiki pages

<export-path>/css/*.css

For -format pages – style sheets for use in browsing the exported content

<export-path>/index.html

For -format pages – a generated landing page with links to all exported wikis and pages

HISTORY

The wikiadmin command first allowed export in macOS Server 3.2. The packaging format for exported wikis was revised with macOS Server 4.1, but the new wikiadmin still supports importing from the older export formats.

macOS Server March 30, 2017 macOS Server

February 8th, 2018

Posted In: Mac OS X, Mac OS X Server


In this article, I looked at enabling SMB and AFP shares via the command line for macOS:

Setup the File Sharing Service in macOS 10.13, High Sierra

One thing I didn’t cover is enabling SMB sharing for a specific user. This is different, as passwords need to be stored in an SMB hash, and you can set that hash type with the pwpolicy command. To do so, we’ll run the command with the -u option so we can supply the username, then -sethashtypes followed by SMB-NT as the hash type, followed by “on”, as can be seen here:

pwpolicy -u charles.edge -sethashtypes SMB-NT on

The interpreter then asks for a password (which can be supplied programmatically with expect if this is done while creating the account):

Password for authenticator charles.edge:
Setting hash types for user <charles.edge>
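As mentioned, that prompt can be scripted with expect during account creation. A sketch with placeholder credentials, guarded so it's a no-op on machines without expect or pwpolicy; never hard-code a real admin password like this outside of a controlled provisioning script:

```shell
if command -v expect >/dev/null 2>&1 && command -v pwpolicy >/dev/null 2>&1; then
  expect <<'END'
spawn pwpolicy -u charles.edge -sethashtypes SMB-NT on
expect "Password for authenticator"
send "adminpassword\r"
expect eof
END
else
  echo "expect and/or pwpolicy not available; skipping"
fi
```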

February 2nd, 2018

Posted In: Mac OS X, Mac OS X Server


Many of the people that read my articles undoubtedly already know this, but Apple has announced a sharp reduction in the number of services provided. Per this article, the Calendar, Contacts, DHCP, DNS, Mail, Messages, NetInstall, VPN, Websites, and Wiki services are being deprecated, and Apple has provided a few alternatives, per service, that they recommend moving to. The affected services, per the above article, are the following:

  • Calendar
  • Contacts
  • DHCP
  • DNS
  • Mail
  • Messages
  • NetInstall
  • VPN
  • Websites
  • Wiki

I’ve been saying many of these services/features should go away in macOS Server so the developers could focus on providing an excellent experience and solid QA/unit testing for the services/features that remain. The fact that apps are being swiftified is great, as it speaks volumes to the future of the services themselves. The fact that Apple is reducing the number of licenses they’re tracking and the mistakes they’re allowing customers to make is also great.

Having said that, every time I think that a service should go away, I hear from someone that they rely on that service. Most of this feedback comes from consultants who have made the server a central part of their consultancy. As someone who used to plan services as products for customers in consultancies, if you find yourself in similar situations when planning where services go when Apple retires them, I would strongly recommend looking at SaaS solutions where customers can give you a login and you can help guide them into a new and better solution. At least, that’s the way I positioned most of these services in the last version of the macOS Server book…

Yes, it was great having Apple handle all of the patching and customers were able to take advantage of a lot of technology with very few resources. However, that’s just not where we are any more. And rather than argue about it or try emailing Tim Cook or make petitions or even complain, save your cycles and look for new and better replacements for each service (preferably not ones that require physical servers, provided that customers are okay with that)! 

And stay tuned. I suspect we’ll cover this on an upcoming episode of the Mac Admins Podcast! 😉

What are your thoughts? Remorse? Applause?

January 25th, 2018

Posted In: Mac OS X, Mac OS X Server


Here’s a quick extension attribute to check for OSX/MaMi. Basically, I’m just looking for one of the two DNS servers that always gets put into the list. From what I’ve seen they always come in pairs.  Below is a gist.
Oh and thanks Katie for noticing my errant space! 😉

January 23rd, 2018

Posted In: Uncategorized

Thanks, Doug. I actually giggled…

January 18th, 2018

Posted In: Uncategorized

Spinnaker seems kinda’ complicated at first, but it’s not. To get it up and running, first install cask:

brew tap caskroom/cask
brew install brew-cask

Then redis and java:

brew install redis
brew cask install java

Clone spinnaker from https://github.com/spinnaker/spinnaker.git (I dropped mine into ~/usr/local/build/spinnaker). From your spinnaker folder, make a build directory and then run the script to update the source:

mkdir ~/usr/local/spinnaker/build
cd ~/usr/local/spinnaker/build
~/usr/local/spinnaker/dev/refresh_source.sh --pull_origin --use_ssh --github_user default

From your build directory, fire it up:

~/usr/local/spinnaker/dev/run_dev.sh

Now run hal to see a list of versions:

hal version list

Then enable the version you want (e.g. 1.0.0):

hal config version edit --version 1.0.0

Then apply the version:

hal deploy apply

Then connect to fire up the UI:

hal deploy connect

Voilà, now it’s just a GUI tool like anything else!

January 4th, 2018

Posted In: Mac OS X

