Carbonite is a great tool for backing up Macs and Windows devices. To install Carbonite, download it from www.carbonite.com. Once downloaded, copy the app to the /Applications directory and open it.
The Carbonite app will then install the components required to support the backup operations and index the drive.
Next, you’ll see some basic folders that will be backed up. Check the box for those you want to add to the backup (or do this later) and click the Install button.
Click Open Carbonite.
Notice that the backup has begun! The only real user-configurable option is selecting the directories to be backed up, which is done using the left-hand sidebar.
And that’s it. There aren’t a lot of other options in the GUI. You can access more options at /Library/Preferences/com.carbonite.carbonite.plist.
krypted April 12th, 2018
Posted In: Mac OS X
backup macOS, carbonite, macos
The DNS service in macOS Server was simple to set up and manage. It’s a bit more manual in macOS without macOS Server. The underlying service that provides DNS is BIND. BIND requires a compiler to build, so first make sure you have the Xcode command line tools installed. To download BIND, go to ISC at https://www.isc.org/downloads/. From there, download the source locally and extract the tar file. Once that’s extracted, run configure from within the extracted directory:
./configure --enable-symtable=none --infodir="/usr/share/info" --sysconfdir="/etc" --localstatedir="/var" --enable-atomic="no" --with-gssapi=yes --with-libxml2=no
Next, run make:
make
Then run make install:
sudo make install
Now you’ll need a LaunchDaemon plist (I just stole the org.isc.named.plist from a macOS Server, which can be found at /Applications/Server.app/Contents/ServerRoot/System/Library/LaunchDaemons/org.isc.named.plist). The ownership and permissions for a custom LaunchDaemon need to be set appropriately:
sudo chown root:wheel /Library/LaunchDaemons/org.isc.named.plist
sudo chmod 644 /Library/LaunchDaemons/org.isc.named.plist
Then start it up and test it!
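If you don’t have a macOS Server install handy to copy from, a LaunchDaemon along these lines should work. This is a sketch, not Apple’s exact file: the label matches Apple’s, but the path to named depends on your configure flags (with the flags above, make install typically puts it in /usr/local/sbin — adjust to match your build):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>org.isc.named</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/local/sbin/named</string>
        <!-- -f keeps named in the foreground, which launchd expects with KeepAlive -->
        <string>-f</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>
```

Load it with sudo launchctl load -w /Library/LaunchDaemons/org.isc.named.plist, then verify queries answer with dig @127.0.0.1.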
krypted April 11th, 2018
Posted In: Mac OS X, Mac OS X Server
install bind, MAC, macos, os x, replace macOS server
Synology is able to do everything a macOS Server could do, and more. So if you need to move your VPN service, it’s worth looking at a number of different solutions. The most important question to ask is whether you actually need a VPN any more. If you have git, mail/groupware, or file services that require remote access then you might want to consider moving these into a hosted environment somewhere. But if you need access to the LAN and you’re a small business without other servers, a Synology can be a great place to host your VPN services.
Before you set up anything new, first snapshot your old settings. Let’s grab which protocols are enabled, running the following from Terminal:
sudo serveradmin settings vpn:Servers:com.apple.ppp.pptp:enabled
sudo serveradmin settings vpn:Servers:com.apple.ppp.l2tp:enabled
Next, we’ll get the IP ranges used so we can mimic those (or change them) in the new service (repeat with l2tp in place of pptp for the L2TP range):
sudo serveradmin settings vpn:Servers:com.apple.ppp.pptp:IPv4:DestAddressRanges
Now let’s grab the DNS servers handed out so those can be recreated:
sudo serveradmin settings vpn:Servers:com.apple.ppp.pptp:DNS:OfferedServerAddresses:_array_index
sudo serveradmin settings vpn:Servers:com.apple.ppp.l2tp:DNS:OfferedServerAddresses:_array_index
Finally, if you’re using L2TP, let’s grab the shared secret:
sudo serveradmin settings vpn:Servers:com.apple.ppp.l2tp:L2TP:IPSecSharedSecretValue
Once we have all of this information, we can configure the new server using the same settings.
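Since serveradmin just prints key = value pairs, you can capture the whole vpn namespace to a file (sudo serveradmin settings vpn > vpn-snapshot.txt) and pull individual values back out later. A rough sketch of the parsing, run here against sample output since serveradmin only exists on macOS Server (the addresses below are made-up examples):

```shell
# Sample serveradmin output captured from the old server (example values)
snapshot='vpn:Servers:com.apple.ppp.l2tp:enabled = yes
vpn:Servers:com.apple.ppp.l2tp:IPv4:DestAddressRanges:_array_index:0:start = "192.168.1.200"
vpn:Servers:com.apple.ppp.l2tp:IPv4:DestAddressRanges:_array_index:0:end = "192.168.1.249"'

# Pull just the value for a given key, stripping the quotes serveradmin adds
get_setting() {
  printf '%s\n' "$snapshot" | awk -F' = ' -v key="$1" '$1 == key { gsub(/"/, "", $2); print $2 }'
}

get_setting 'vpn:Servers:com.apple.ppp.l2tp:enabled'
get_setting 'vpn:Servers:com.apple.ppp.l2tp:IPv4:DestAddressRanges:_array_index:0:start'
```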
To install the VPN service on a Synology, first open the Synology web interface and click on Package Center. From there, click on All and search for VPN.
Then click on the Install button for VPN Server. Once installed, open VPN Server from the application launcher in the upper left-hand corner of the screen. Initially, you’ll see a list of the services that can be run, which include the familiar PPTP and L2TP, along with the addition of OpenVPN.
Before we potentially open up dangerous services to users we might not want to have access to, click on Privilege. Here, enable each service for each user that you want to have access to the VPN services.
Now that we can safely enable and disable each of the services, click on PPTP in the sidebar of the VPN Server app (if you want to provide PPTP-based services to clients).
Here, check the box for “Enable PPTP VPN server” and enter the following information:
- Dynamic IP address: The first DHCP address that will be given to client computers
- Maximum connection number: How many addresses can be handed out (and therefore the maximum number of clients that can connect via PPTP).
- Maximum number of connections with the same account: How many sessions a given account can have (1 is usually a good number here).
- Authentication: Best to leave this at MS-CHAP v2 for compatibility, unless you find otherwise.
- Encryption: Leave as MPPE optional unless all clients can do MPPE and then you can enforce it for a stronger level of encryption.
- MTU: 1400 is a good number.
- Use manual DNS: If clients will connect to services via names once connected to the VPN, I’d put your primary DNS server in this field.
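The pool of client addresses starts at the “Dynamic IP address” and runs for “Maximum connection number” addresses. A quick sketch of that arithmetic (the addresses here are examples), handy for checking that the pool won’t collide with your LAN’s existing DHCP scope:

```shell
# Example values: first address in the pool and the pool size
first="10.0.1.100"
max_connections=10

prefix=${first%.*}      # network portion: 10.0.1
start=${first##*.}      # last octet of the first address: 100
last="$prefix.$((start + max_connections - 1))"

echo "PPTP pool: $first - $last"
```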
Click Apply and open TCP port 1723 (PPTP also uses GRE, IP protocol 47) so clients can connect to the service. If you’ll be using L2TP over IPSec, click on “L2TP/IPSec” in the sidebar. The settings are the same as those above, but you can also add a preshared key to the mix. Go ahead and check the enable checkbox, provide the necessary settings from the PPTP list, provide that key, and then click on Apply. Note that the DHCP pools are different between the two services. Point UDP ports 1701, 500, and 4500 at the new server to allow for remote connections and then test that clients can connect.
That’s it. You’ve managed to get a new VPN set up and configured. Provided you used the same IP address and shared secret, and the ports are the same, you’ll probably be able to use the same profile you were using previously to configure clients.
krypted April 6th, 2018
Posted In: Mac OS X Server, Mac Security, Synology
Apple, l2tp, mac clients, macos, macos server, migrate vpn, pptp, Synology
People who have managed Open Directory and will be moving to Synology will note that directory services really aren’t nearly as complicated as we’ve made them out to be for years. This is because Apple was protecting us from doing silly things to break our implementations. It was also because Apple bundled a number of seemingly disparate technologies into LDAP. It’s worth mentioning that LDAP on a Synology is just LDAP. We’re not federating services, we’re not Kerberizing services, we’re not augmenting schemas, etc. We can leverage the directory service to provide attributes, though, and have that central phone book of user and group memberships we’ve come to depend on directory services to provide.
To get started, open the Package Center and search for Directory. Click Install for the Directory Server and the package will be installed on the Synology.
When the setup is complete, open the Directory Server from the launcher available in the upper right hand corner of the screen.
The LDAP server isn’t yet running as you need to configure a few settings before starting. At the Settings screen, you can enable the LDAP service by checking the box to “Enable LDAP Service” and providing the hostname (FQDN) of the service along with a password.
Once the service is configured, you’ll have a base DN and a bind DN. These are generated from the name provided in that FQDN field. For example, if the FQDN is “synology.krypted.com”, its Base DN will be “dc=synology,dc=krypted,dc=com”. And the Bind DN adds a lookup starting at the root, inside the users container: uid=root,cn=users,dc=synology,dc=krypted,dc=com
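The derivation is mechanical: each dot-separated label of the FQDN becomes a dc= component. A quick sketch of that mapping, using the example hostname above:

```shell
fqdn="synology.krypted.com"

# Each dot-separated label in the FQDN becomes a dc= component of the Base DN
base_dn=$(printf '%s' "$fqdn" | awk -F. '{ for (i = 1; i <= NF; i++) printf "%sdc=%s", (i > 1 ? "," : ""), $i }')

# The directory administrator binds as root inside the users container
bind_dn="uid=root,cn=users,$base_dn"

echo "$base_dn"
echo "$bind_dn"
```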
If this is for internal use, then it’s all setup. If you’ll be binding external services to this LDAP instance, make sure to open ports 389 (for LDAP) and/or 636 (for LDAP over SSL) as well.
Once you have information in the service, you’ll want to back it up. Click on Backup and Restore. Then click on Configure.
At the Configure screen, choose a destination.
I prefer using a directory I can then back up with another tool. Once you have defined a place to store your backups using the Destination field, choose a maximum number of backups and configure a schedule for the backups to run (by default, backups run at midnight). Then click OK. You now have a functional LDAP service. To create groups, click on Group in the left sidebar.
Here, you can easily create groups by clicking on the Create button. At the wizard, enter a name for the group (accounting in this example).
Click Next, then Apply to finish creating the group. Once you have created your groups, click on User to start entering your users. Click Create. At the User Information screen, enter the name, a description if needed, and the password for a user. You can also restrict password changes and set an expiration for accounts. Click Next to create the user.
At the next screen, choose what groups the new user will be in and click Next.
Enter any extended attributes at the next screen, if you so choose (useful for directories).
Click Next and then Apply.
For smaller workgroups, you now have a functional LDAP service! If you’d like a nice GUI to access more options, look at FUM (https://github.com/futurice/futurice-ldap-user-manager), LAM (https://www.ldap-account-manager.org/lamcms/), LinID (http://www.linid.org/welcome/index.html), or other tools. I wrote an article on LDAP SACLs awhile back, so I’ll try and track that down and update it for Synology soon!
krypted April 5th, 2018
Posted In: Mac OS X Server, Synology
Apple, MAC, macos, macos server, migrate, move open directory to openldap, OpenLDAP, SACL, setup, users
Services that run on a Synology are constantly being updated. Updates for the binaries and other artifacts can be applied quickly and easily. To do so, open the Synology web interface and then open Package Center. From Package Center, click Update for each package, or Update All to upgrade all services at once, as seen below.
You will then be prompted to verify that you want to run the update.
Any services that are being updated will restart and so end users might find those services unresponsive or have to log back in after the service comes back online.
krypted March 27th, 2018
Posted In: Network Infrastructure, Small Business, Synology
Apple, MAC, macos, NetApp, network appliance, packages, Synology
The past couple of years have forced me to rethink many of my recommendations for how you back up computers in small office and home environments. Previously, I would have said that you could use a disk attached to an Apple AirPort. But the AirPort Base Station is no longer being made. Previously, I would have said you could use Time Machine Server, a service built into macOS Server 5.4 and below. But that service is no longer included in macOS Server by Apple and is now found in the Sharing System Preference pane. Previously, I might have even said to use the home edition of CrashPlan, which could back up to their cloud and/or a home server. But that plan is no longer being offered by Code 42.
So what are we to do? Well, luckily the offerings out there are now just about endless. One of those offerings is so easy, you can run out to Best Buy, return home with a WD (Western Digital) My Cloud Home drive, and be up and running in about 5 minutes. I’ll cover other options when I cover file services and Synology. But in the meantime, let’s look at setting up a WD My Cloud Home drive and account, and configuring both to work with Time Machine.

Setup Your WD Hard Drive
First, we’ll set up the drive. This is pretty straightforward. Plug the ethernet cable into your network, wait for the drive to boot up, and then go to the MyHome setup page.
Here, you’ll be prompted to setup a My Cloud Home account. Enter a name, email address, and password. Then click on Create Account.
You’ll then be prompted for the device you plugged in, which is discovered on the network. Click Connect.
Choose whether you want to share product improvement data. Ever since my time as a product manager, I’ve been a huge fan of doing so, so I clicked Share.
Once that’s done, you’ll be prompted to get the desktop app. While not absolutely necessary, it’s not a bad idea. If you want the app, click Download.
Once the app is done downloading, open the directory and open the installer.
Click Install Now.
Once complete, you’ll see the menu bar item. Click it, and if you don’t see your device, add it by clicking on “I don’t see my device.”
When prompted, enter your email address and password that you created earlier and then click on Sign In.
Next, if the notifications area shows a software update, make sure to run it. There was a pretty bad vulnerability awhile back, and updating will make sure you’re good. Then click on the name of your WD My Cloud Home.
Add IFTTT Alerts
I want to see when new updates, channels, or options are added, so I’m going to enable that. To do so, click on Services in the sidebar, and then click on Enable for IFTTT.
Assuming the terms of service are acceptable, click “I Agree”
When prompted, choose to connect to IFTTT.
From the IFTTT site, click Connect.
Choose which options to give IFTTT for the MyCloud API.
Browse the channels, enable each that you’d like, and then click “Turn on.”

Mount the MyCloud Drive
Next, open a “Connect to Server” dialog box (Command-K from the Finder) and click on Browse.
Click on the MyCloud-XXX where XXX is the identifier for your MyCloud account.
Click on the timemachinebackup folder.
The folder should initially be empty. Now let’s open the Time Machine System Preference pane.
Click on “Select Backup Disk…”

Choose Your MyDisk From Time Machine
Choose the TimeMachineBackup directory for the MyCloud Device and click on “Use Disk.”
You’ll then want to create a user for backing up. To do so, go back to the mycloud.com site and click on settings. Then click on “Add user…” and enter an email address.
The email address will get an email to setup an account. Do so and then once you’ve configured the user, enter the email address and password when prompted.
Now wait for the first backup to finish. If you ever see any errors, check them; otherwise, you should back up to the device as you would with a locally attached drive, but you won’t need to plug directly into the drive to run backups.

Conclusion
This doesn’t solve for a lot of use cases that Time Machine Server would have been better for. But it’s a simple task that should cost you a little over a hundred bucks and get you backing up. I’m still a fan of cloud services. Backblaze, Carbonite, and others will backup your data for an annual fee of a little less than what a MyDrive costs. I’ll cover those in later articles, but for now, you’ve got a backup on your network, which even if you use one of those services is a great option in the event of hardware failure, as you can quickly get back up and running with a full system restore!
krypted March 12th, 2018
Posted In: Mac OS X, Network Infrastructure
app, Apple, backup, macos, mycloud, wd, wd mycloud
AutoPkgr is basically a small app that allows you to select some repositories of recipes and then watch and run them when they update. It’s a 5-minute-or-less installation, and at its simplest will put software packages into a folder of your choosing so you can test/upload/scope to users. Or you can integrate it with third-party tools like Munki, FileWave, or Jamf using the JSSImporter. Then, if you exceed what it can do, you can also dig under the hood and use AutoPkg itself. It’s an app, and so it needs to run on a Mac. Preferably one that doesn’t do much else.
You can obtain the latest release of Autopkgr at https://github.com/lindegroup/autopkgr. To install, drag the app to the Applications folder.
When you open AutoPkgr for the first time, you’ll be prompted for a user name and password to install the helper tool (think menu item).
The menu item then looks like the following.
These are the most common tasks that administrators would run routinely. They involve checking Autopkg recipes to see if there are new versions of supported software titles, primarily. Opening the Autopkgr app once installed, though, shows us much more. Let’s go through this screen-by-screen in the following sections.
Moving AutoPkg Folders Around
By default, when installed with Autopkgr, Autopkg stores its cache in ~/Library/AutoPkg/Cache and the repos are sync’d to ~/Library/AutoPkg/RecipeRepos. You can move these using the Choose… button in the Folders & Integration tab of Autopkgr, although it’s not necessary (unless, for example, you need to move the folders to another volume).
Note: You can also click on the Configure AutoPkg button to add proxies, pre/post processing scripts, and GitHub tokens if needed.
Keeping Autopkg and Git up-to-date
The Install tab is used to configure AutoPkg settings. If there is a new version of AutoPkg and Git, you’ll see an Install button for each (used to obtain the latest and greatest scripts); otherwise you’ll see a green button indicating it’s up-to-date.
You can also configure AutoPkgr to be in your startup items by choosing to have it be available at login, and show/hide the Autopkgr menu item and Dock item.
Configuring Repositories and Recipes
Repositories are where collections of recipes live. Recipes are how they’re built. Think of a recipe as a script that checks for a software update and then follows a known-good way of building that package. Recipes can then be shared (via GitHub) and consumed en masse.
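The “script that checks for an update” idea can be sketched in a few lines of shell. This is just the concept, not AutoPkg’s real recipe format (actual recipes are plists of processors); the versions are example values:

```shell
installed="61.0.1"   # version we built last time
latest="62.0"        # version advertised upstream

# Highest of the two, comparing the dot-separated fields numerically
newest=$(printf '%s\n%s\n' "$installed" "$latest" | sort -t. -k1,1n -k2,2n -k3,3n | tail -n 1)

if [ "$newest" = "$latest" ] && [ "$latest" != "$installed" ]; then
  echo "update available: $latest"   # a real recipe would download and package here
else
  echo "up to date"
fi
```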
To configure a repository, click on the “Repos & Recipes” tab in Autopkgr. Then select the repos to use (they are sorted by stars so the most popular appear first).
Note: There are specific recipes for Jamf Pro at https://github.com/autopkg/jss-recipes.git.
Then you’ll see a list of the recipes (which again, will make packages) that AutoPkgr has access to. Check the ones you want to build and click Run Recipes Now.
If you don’t see a recipe for a title you need, use the search box at the bottom of the screen. That will show you a given entry for any repos that you’ve added. Again, all of the sharing of these repos typically happens through GitHub, but you can add any git URL (e.g., if you want a repo of recipes in your organization).
Once you’ve checked the boxes for all the recipes you want to automate, you can then use the “Run AutoPkg Now” option in the menu items to build, or rely on a routine run, as described in the next section.
Scheduling Routine Builds
Autopkgr can schedule a routine run to check recipes. This is often done at night after administrators leave the office. To configure, click on the schedule tab and then check the box for Enable scheduled AutoPkg runs. You can also choose to update your recipes from the repos by checking the “Update all recipes before each AutoPkg run” checkbox.
Getting Notified About New Updates To Packages
I know this sounds crazy. But people like to get notified when there’s a new thing showing up. To configure a variety of notification mechanisms, click on the Notifications tab in AutoPkgr.
Here, you can configure alerts via email, Slack, HipChat, macOS Notification Center, or via custom webhooks.
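A custom webhook is just an HTTP POST with a JSON body. A sketch of what such a notification boils down to — the URL is a placeholder, and the payload shape shown is Slack’s incoming-webhook format:

```shell
# Placeholder URL; replace with the webhook endpoint your service gives you
webhook_url="https://hooks.slack.com/services/XXX/YYY/ZZZ"
title="Firefox"
version="62.0"

# Build the JSON payload for the notification
payload=$(printf '{"text": "AutoPkg built %s %s"}' "$title" "$version")
echo "$payload"

# To actually deliver it:
# curl -X POST -H 'Content-Type: application/json' -d "$payload" "$webhook_url"
```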
Integrating Autopkg with Jamf (and other supported vendors)
When integrating with another tool, you’ll need to first install the integration. To configure the JSSImporter, we’ll open the “Folders & Integrations” tab in Autopkgr and then click on the Install JSSImporter button.
Once installed, configure the URL, username and password (for Customer API access) and configure any distribution points that need to have the resultant packages copied to.
Once the JSSImporter is configured, software should show up in Jamf Pro scoped to a new group upon each build. It is then up to the Jamf Administrator to complete the scoping tasks so software shows up on end user devices.
What the JSSImporter Does from Autopkg
This option doesn’t seem to work at this time. Using the following may make it work:
sudo easy_install pip && pip install -I --user pyopenssl
Note: The above command may error if you’re using macOS Server. If so, call easy_install directly via
krypted February 9th, 2018
Posted In: Mac OS X, Mac OS X Server, Mac Security, Mass Deployment
AutoPkgr, MAC, macos
In a previous article, I looked at enabling SMB and AFP shares via the command line for macOS:
Setup the File Sharing Service in macOS 10.13, High Sierra
One thing I didn’t cover is enabling SMB sharing for a specific user. This is different, as passwords need to be stored in an SMB hash, and you can set that hash type with the pwpolicy command. To do so, we’ll run the command with the -u option to supply the username, then -sethashtypes followed by SMB-NT as the hash type, followed by “on”, as can be seen here:
pwpolicy -u charles.edge -sethashtypes SMB-NT on
The interpreter then asks for a password (which can be supplied programmatically with expect if this is done while creating the account):
Password for authenticator charles.edge:
Setting hash types for user charles.edge
krypted February 2nd, 2018
Posted In: Mac OS X, Mac OS X Server
add smb user in sharing using command line, macos, set hash type, SMB, terminal
Spinnaker seems kinda’ complicated at first, but it’s not. To get it up and running, first install Homebrew Cask:
brew tap caskroom/cask
brew install brew-cask
Then redis and java:
brew install redis
brew cask install java
Download Spinnaker from https://github.com/spinnaker/spinnaker.git (I dropped mine into ~/usr/local/build/spinnaker). From your spinnaker folder, make a build directory and then run the script to update the source:
~/usr/local/spinnaker/dev/refresh_source.sh --pull_origin --use_ssh --github_user default
From your build directory, fire it up:
Now run hal to see a list of versions:
hal version list
Then enable the version you want (e.g. 1.0.0):
hal config version edit --version 1.0.0
Then apply the version:
hal deploy apply
Then connect to fire up the UI:
hal deploy connect
Voilà, now it’s just a GUI tool like anything else!
krypted January 4th, 2018
Posted In: Mac OS X