“Taking a new step, uttering a new word, is what people fear most.”
― Fyodor Dostoyevsky, Crime and Punishment
The Apple Wiki Server is sadly going away. I always liked this service. It was thoughtfully designed and looked much nicer than most of the other tools available out there. Sure, you couldn’t write articles offline, write in markdown, or do a lot of other things that I’ve learned to both love and hate from other solutions, but honestly it always felt like the most Apple of services in macOS Server because it didn’t have every-single-checkbox. So, I’ll pour a little Jäger on the ground in memory of the wiki server and then… export some stuff and move on.
Before we get started, let’s talk about where you’re going to be putting this stuff. You can export in five formats (in order of the likelihood you’ll use them):
- wxr: the native WordPress format, which can also be used by a variety of other solutions, since WordPress is their reference competitor of sorts. WordPress also has wiki plugins, which you could experiment with once you’ve imported all of your stuff. ExpressionEngine, Drupal, and many other solutions support importing via the wxr format. For importing into Confluence, check out https://wiki.afm.co/display/PUBL/HOW+to+import+Wordpress+into+Confluence which requires you to run your own server temporarily if your ultimate goal is to move to the Atlassian Cloud (which is pretty much what I ended up doing).
- pages: A folder that stores static html, rather than the dynamic html files that the wiki service builds. You might use this if you just want a permanent archive of the wiki service, and maybe hire an intern to cut/copy/paste pages into a new wiki solution, like Confluence.
- json: If you’re going to be scripting a custom import into another solution, then json is likely your best bet. I blame python: there are more modules than I can count that assist with manipulating json. If another wiki or documentation tool has an import option, you can find a way to get your data in. You’ll likely encounter broken links, etc., unless you correct those in the script during import, which is a lot of logic.
- legacy: Uses PostgreSQL. Probably not useful for a lot of people. I’ve done some work reverse engineering that database, but it changes routinely and so that work is put out of date at regular intervals.
- decoded: A more Swift-y export. I think I’d only use this if I were building a Swift app based on my export, which I can’t imagine doing.
The export command is now built into wikiadmin (unlike the 10.5 or 10.6 era covered in a couple of previous articles I wrote, when it was a standalone command). To export, you’ll run wikiadmin followed by the export verb. The next option you’ll provide is either -all or the -name of the wiki (multiple wikis can be provided as comma-separated values), then the -format you’re exporting to (listed above), and finally the -path of the destination. Putting that all together would look like the following if we’re exporting into WordPress:
sudo /Applications/Server.app/Contents/ServerRoot/usr/bin/wikiadmin export -name Legal -format wxr -path /exports
Exports will be owned by _teamserver, so to get to the data you’ll have to open up permissions on the files (adding read for all and traverse on the directories):
sudo chmod -R a+rX /exports
Inside that target directory, you’ll see a number of files as you can see in the following screenshot:
So, for wxr there will be a directory called wiki that doesn’t have much in it. And then there will be an xml file for each user that has a “page” as well as another with all the articles in it. You’ll need to look at them, but typically the biggest and last one exported (so the last one in the directory listing) will have all your articles.
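Rather than eyeballing the directory listing, you can let a couple of lines of Python find the biggest XML file for you. A minimal sketch, assuming the export landed in /exports (change the path to taste), and remembering to verify the result before importing:

```python
from pathlib import Path

def largest_xml(directory):
    """Return the largest .xml file in a directory; in these exports that
    tends to be the one holding all the articles, but verify before importing."""
    files = sorted(Path(directory).glob("*.xml"), key=lambda p: p.stat().st_size)
    return files[-1] if files else None

# Example: largest_xml("/exports")
```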
Once you have the correct XML file, you can import it! To do so, go to WordPress, hover over Tools in the sidebar and click on Import. At the Import screen, select the XML file, and then click on the “Upload file and import” button.
Note: If the import fails, you may have to edit your php.ini to increase post_max_size or upload_max_filesize. But once the import is complete, I’d change those back to the defaults.
Now, let’s say that instead I’m exporting to json. In the following iteration of the earlier command, I’m going to use -all and then json as the -format:
sudo /Applications/Server.app/Contents/ServerRoot/usr/bin/wikiadmin export -all -format json -path /exports
This output starts as follows (I left off the subsequent articles):
{
  "LongName" : "Legal",
  "UpdateTime" : "2018-02-09T08:37:41.691-0600",
  "Description" : "test",
  "ExportDate" : "2018-02-09 16:06:51 +0000",
  "Revision" : 1,
  "UpdatedByLogin" : "charles.edge",
  "BlogPosts" : [ ],
  "Theme" : "carbon,,",
  "WikiUID" : "c7de0acb-baae-baae-e9bd-f78d3e4d43ed",
  "ShortName" : "legal",
  "Pages" : [
    {
      "LongName" : "test page 1",
      "UpdateTime" : "2018-02-09T08:38:32.025-0600",
      "UpdatedByLogin" : "charles.edge",
      "Revision" : 3,
      "PageTextValue" : "To edit this page, click the Edit (pencil) button. To delete this page, click the Action (gear) button and choose Delete. When you edit this page, you can easily rename the page, and use the editing toolbar to: Apply paragraph or character styles to text. Create bulleted lists, numbered lists, and tables. Insert media, such as images, audio, or QuickTime movies. Attach files. Insert an HTML snippet from another website or email. For more information about editing pages, click the Action (gear) button and choose Help. 1 1",
      "TinyID" : "x4j836N4C",
      "Tags" : [ ],
      "CreateTime" : "2018-02-09T08:38:07.039-0600",
      "RelatedItems" : [
        {
          "LongName" : "Legal",
          "UID" : "c7de0acb-baae-baae-c6c2-aae392690186"
        }
      ],
      "CreatedByLogin" : "charles.edge",
      "RenderedPage" : "<div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32\" class=\"block wrapchrome text\" data-guid=\"24c701f8-0e15-49a7-b331-f8b755cb6a32\" data-type=\"text\" contenteditable=\"false\"><div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32-wrapper\" class=\"wrapper wrapchrome\"><div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32-inner\" class=\"inner wrapchrome\"><div class=\"content selectable wrapchrome\"><div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32-editable\" class=\"editable wrapchrome\"><p>To edit this page, click the Edit (pencil) button. To delete this page, click the Action (gear) button and choose Delete.<\/p><p>When you edit this page, you can easily rename the page, and use the editing toolbar to:<\/p><ul><li>Apply paragraph or character styles to text.<\/li><li>Create bulleted lists, numbered lists, and tables.<\/li><li>Insert media, such as images, audio, or QuickTime movies.<\/li><li>Attach files.<\/li><li>Insert an HTML snippet from another website or email.<\/li><\/ul><p>For more information about editing pages, click the Action (gear) button and choose Help. 1 1<\/p><\/div><\/div><\/div><\/div><\/div>",
      "RevisionHistory" : [
        {
          "ChangedByLogin" : "charles.edge",
          "Version" : 1,
          "PageTextValue" : "",
          "RenderedPage" : "<div class=\"block text\"><div class=\"content\"><div class=\"editable\"><p>To edit this page, click the Edit (pencil) button. To delete this page, click the Action (gear) button and choose Delete.<\/p><p>When you edit this page, you can easily rename the page, and use the editing toolbar to:<\/p><ul><li>Apply paragraph or character styles to text.<\/li><li>Create bulleted lists, numbered lists, and tables.<\/li><li>Insert media, such as images, audio, or QuickTime movies.<\/li><li>Attach files.<\/li><li>Insert an HTML snippet from another website or email.<\/li><\/ul><p>For more information about editing pages, click the Action (gear) button and choose Help.<\/p><\/div><\/div><\/div>",
          "ChangeType" : "create",
          "ChangeTime" : "2018-02-09T08:38:07.039-0600"
        },
        {
          "ChangedByLogin" : "charles.edge",
          "Version" : 2,
          "PageTextValue" : "To edit this page, click the Edit (pencil) button. To delete this page, click the Action (gear) button and choose Delete. When you edit this page, you can easily rename the page, and use the editing toolbar to: Apply paragraph or character styles to text. Create bulleted lists, numbered lists, and tables. Insert media, such as images, audio, or QuickTime movies. Attach files. Insert an HTML snippet from another website or email. For more information about editing pages, click the Action (gear) button and choose Help. 1",
          "RenderedPage" : "<div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32\" class=\"block wrapchrome text\" data-guid=\"24c701f8-0e15-49a7-b331-f8b755cb6a32\" data-type=\"text\" contenteditable=\"false\"><div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32-wrapper\" class=\"wrapper wrapchrome\"><div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32-inner\" class=\"inner wrapchrome\"><div class=\"content selectable wrapchrome\"><div id=\"text-block-view-24c701f8-0e15-49a7-b331-f8b755cb6a32-editable\" class=\"editable wrapchrome\"><p>To edit this page, click the Edit (pencil) button. To delete this page, click the Action (gear) button and choose Delete.<\/p><p>When you edit this page, you can easily rename the page, and use the editing toolbar to:<\/p><ul><li>Apply paragraph or character styles to text.<\/li><li>Create bulleted lists, numbered lists, and tables.<\/li><li>Insert media, such as images, audio, or QuickTime movies.<\/li><li>Attach files.<\/li><li>Insert an HTML snippet from another website or email.<\/li><\/ul><p>For more information about editing pages, click the Action (gear) button and choose Help. 1<\/p><\/div><\/div><\/div><\/div><\/div>",
          "ChangeType" : "edit",
          "ChangeTime" : "2018-02-09T08:38:26.194-0600"
        }
      ]
    }
  ]
}
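If you’re scripting a custom import from this json, a minimal sketch of pulling titles and plain text out of a single wiki’s export (the structure is inferred from the sample above; an -all export may wrap several of these objects, so check your file first):

```python
import json

def pages_from_export(path):
    """Yield (title, text) pairs for each page in a wikiadmin json export,
    assuming a single wiki object with a top-level Pages array."""
    with open(path) as f:
        wiki = json.load(f)
    for page in wiki.get("Pages", []):
        yield page.get("LongName", ""), page.get("PageTextValue", "")
```

From there it’s up to your target tool’s import API; the HTML in RenderedPage is also available if you’d rather preserve formatting.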
There’s a glaring omission in this article: files. I’ll get to those later in another article. But the gist is that you can connect over WebDAV and move them, but you kinda break any links…
Finally, if I came up short in this article (as I often do), the official wikiadmin man page:
wikiadmin export -name somewiki,anotherwiki -path /var/tmp/two-exported-wikis
wikiadmin import -all -path /var/tmp/two-exported-wikis
wikiadmin import -all -path /var/tmp/two-exported-wikis/Exported.wikis
wikiadmin export -all -format json -path /var/tmp/readable-wikis
wikiadmin export -all -format pages -path /var/tmp/browsable-wikis
wikiadmin export -all -format wxr -path /var/tmp/wxr-wiki-files
wikiadmin migrate -r /Volumes/SnowLeopard/Library/Collaboration
wikiadmin returns a status code of 0 for success. In the event of failure it returns a non-zero status, and writes error messages to stderr.
Log file for wikiadmin activity
Log file for Wiki http server activity
/Library/Server/Wiki/, /tmp/, /var/tmp, /Users/Shared
Folders readable and writable by user _teamserver where exports can typically be placed
Exported.wikis: the name of the bundle created when exporting wikis for formats other than pages.
For -format pages – exported files that were uploaded to the wiki as files or embedded in pages
For -format pages – exported wiki main pages and user profile pages
For -format pages – exported wiki pages
For -format pages – style sheets for use in browsing the exported content
For -format pages – a generated landing page with links to all exported wikis and pages
The wikiadmin command first allowed export in macOS Server 3.2. The packaging format for exported wikis was revised with macOS Server 4.1, but the new wikiadmin still supports importing from the older export formats.
macOS Server man page, March 30, 2017
krypted February 8th, 2018
Posted In: Mac OS X, Mac OS X Server
confluence, import, macos server, wiki, WordPress, wxr
A wiki is a repository of dynamically created and managed content, or content created or edited by multiple users collaboratively. This article is about using the wiki service in macOS Server 5.2 (the Apple Server app running on 10.12/Sierra). I reference file services with WebDAV because it is a very nice integration piece that I think a lot of people will find pretty beneficial.
To get started with the Wiki service, first turn it on. This one isn’t heavily dependent on host names (other than being able to access the server from a browser) or directory services (other than being able to authenticate users, but local accounts are perfectly functional) and it doesn’t require the Websites service to be running as well. One should always have good working directory services and host names, still…
To enable the service, open the Server app and click on Wiki in the list of SERVICES in the List Pane.
There are two configuration options. The first is to select who is able to create wikis. Use the “Wikis can be created by” drop-down list to select “all users” if anyone with an account on the server should be able to create a wiki or “only some users” to bring up the Wiki Creators screen.
If only some users can create new wikis, use the plus sign (“+”) at the Wiki Creators screen to add users and/or groups to the list of users that can create wikis. Click on OK when all users and groups that can create wikis are added. In a school I would imagine that only teachers or IT staff would be able to create wikis. Once a wiki is created, pages inside the wiki can still be created by non-wiki creators.
The other option available is the handy dandy WebDAV interface to the wikis. When you enable this option, you can connect to a server from OS X or iOS via WebDAV and access files in each wiki’s document repository. To be clear, this option doesn’t provide access to the user documents, but does provide access to the wiki documents. We’re going to check the box for “Enable WebDAV access to Wiki files” and then click the ON button.
Once the service starts, click on the View Wiki link in the Wiki workspace in Server app.
Here, click on the Log in button and enter a user with access to the server, preferably one who can create wikis.
At the Wikis page, you will then see a list of all wikis you have access to. Note that the previous screen showed one wiki and now we see two. That’s because one of the wikis has permissions that allow “All unauthenticated users” access to the wiki, which we’ll describe shortly. The first thing most administrators will do is create a wiki. To do so, click on the plus sign (“+”) icon on the web page and at the resultant screen, click on New Wiki.
At the “Create a new wiki” prompt, provide a name for the wiki and a brief description for it.
Click on Continue.
At the Set permissions screen, enter each user or group to provide access to edit and view wiki pages. Here, you’ll have the options for Read & Write (users can view and edit pages in the wiki), Read only (users can only view the contents of your pages) and No access (users have no access to the wiki). There is a group for All logged in users, which includes every user with access to the server, and another for All unauthenticated users, which includes guests to the server. Once you’ve given the appropriate permissions, click on Continue.
Note: You don’t have to get this perfect now as you can always edit these later.
At the Set Appearance screen, you can choose an icon for the wiki (shown in the wiki list and when you open the wiki) as well as a color scheme for the wiki. Choose the appropriate appearance for your wiki (again, you can always change this later) and then click on the Create button.
Once the setup is finished, you’ll see the Setup complete modal. Here, you can click on Go to Wiki button.
Once you’ve created your first wiki, let’s edit it and customize the content. To do so, click on it from the list of available wikis. Click on the cog-wheel icon and then Wiki Settings… to bring up the Wiki Settings page.
Here, you’ll see the previously entered name and description as well as options to enable Calendar (only available if Calendar Server is running on the server) and Blog, which enables a blog service for the wiki (wiki administrators can post blog entries to the wiki). Click on Appearance.
Here, you will have the previous two options as well as the ability to upload a banner (which should be 62 pixels high) and background for each wiki.
Click on Permissions. Here, you’ll see the permissions previously configured, as well as options to configure who can comment on articles in the wiki (choosing nobody disables comments completely) and whether comments require approval (moderation).
Click on Save. Now, let’s edit the splash page. To do so, click the pencil icon in the top navigation bar.
At the edit screen, the top nav bar is replaced by a WYSIWYG editor for managing the page. Here you can justify, link, insert media and of course edit the text you see on the screen. I recommend spending some time embedding links, inserting tables, making text look the way you want it to and editing the content to reflect the purpose of the wiki. Click Save when you’re done. Click the pencil again to edit it, and let’s create a new wiki page. Keep in mind that, like Wikipedia, each page should be linked to from other pages in the order they should be read. Unlike most wikis, there’s actually an index page of all the articles, which can come in handy.
From the edit page, to create a new page and link to it, enter some text (or lasso some) that you’ll use as the link to access the new page you’re creating. Then click on the arrow and select “New page.”
Note: Use Enter URL to link to an existing page or an external website, instead of creating a new page.
At the New Page screen, provide a name for the new page (the lassoed text automatically appears as the Page Title) and click on the Add button.
Click Save and then click on the newly created link. You can now edit the new page the same way you edited the previous pages. Click on the disclosure triangles in the right sidebar to Comment on articles, link articles to related articles, tag articles and view editing history.
Now for the fun part. Click on Documents. Here, you’ll see the pages you already created. Click on the plus sign and select the option to Upload File to the wiki.
At the Upload File dialog, click on Choose File and then select a file to upload.
Click Upload when selected.
Then from the Finder of a macOS client, use the Go menu to select “Connect to Server”. Enter the name or IP of the server and then click on Connect.
Assuming you can access the server, you should then be prompted for a username and password. Enter them and click Connect. Eventually, the file(s) will display (it can take a while depending on your network speed and how many files are in the directory). You can connect to this same screen through an iPad using a 3rd party WebDAV client or the built-in options in Pages.
Managing wikis is as easy as it’s ever been, with the new options for appearance being a nice add-on. Active Directory integration is as easy as binding the server to Active Directory and using the accounts listed in Permissions of pages.
Now that iOS devices can edit wikis and many of the traditional word processing options are available in the wiki editor, consider what the Wiki can be. Could it replace text editing apps for iOS? Could the Wiki allow for more collaborative documents than a Word or other document editor? Could it keep from getting eaten like the rest of the homework? Could the comments in the Wiki be a good way for teachers to have students write responses to materials? Could the Wiki and the document management features allow your workers to access human resources documents and employee manuals? I know plenty of tech firms that use wikis to track information about the systems they manage.
Once you have all of this information, upgrading can seem downright scary. But fear not, there’s Carbon Copy Cloner. And once you’ve cloned, there’s wikiadmin. When doing an upgrade in place, the Wiki service is pretty straightforward to upgrade, but in many cases, due to aging hardware, wiki services are moving from an older computer to a newer one. This can be done in one of two ways. The first is to “migrate” the data by copying the Collaboration folder onto the new system. The second is to “export” and “import” the data. I usually recommend doing a migrate where possible, so we’ll start with that method.
Note: Before getting started, make sure that the directory services side of things is good. If a user or group lookup for an object that owns, edits or has commented on a wiki fails then that wiki probably shouldn’t be migrated. Use the dscl or id commands to confirm that lookups are functioning as intended.
To migrate wikis from one server to another, first copy the Collaboration directory to the new server. In this example, the directory has been dropped onto the desktop of the currently logged in user. To migrate the data once copied, use the wikiadmin command, along with the migration option. The option requires the path to the Collaboration folder, defined with -r, as follows:
sudo wikiadmin migrate -r ~/Desktop/Collaboration
When moving wikis, you can take the opportunity to get rid of a few you don’t want (such as that test wiki from way back when). Or administrators may just choose to move a single wiki to a new server in order to split the load across multiple hosts. When doing so, use the same command as earlier, along with the name of each wiki that is being moved, along with the -g option. For example, if moving the Legal wiki:
sudo wikiadmin migrate -r ~/Desktop/Collaboration -g Legal
The second way of moving wikis around is to export and then import them. To do so, first export wikis on the old server using the wikiadmin command along with the export option, which requires an --exportPath option and needs to be done on a wiki-by-wiki basis. So to export that Legal wiki to a file called LegalWikiTMP on the desktop:
sudo wikiadmin export -g Legal --exportPath ~/Desktop/LegalWikiTMP
Next, copy the wiki to the new server and import it, using the import option along with --importPath to identify where the file being imported is located. Using the same location, the command would then be:
sudo wikiadmin import -g Legal --importPath ~/Desktop/LegalWikiTMP
Note: The ability to import a wiki also allows for an API of sorts, as you can programmatically create wikis from other sources. The ability to export also provides a way to move into another wiki tool if you happen to outgrow the options provided in Server and need to move to something more robust.
There is another way to move wikis, using pg_dump, copying the data and then using pg_restore to import the data once you’ve created the tables. This way is, in my opinion, the last resort if the standard wikiadmin commands aren’t working. In my experience, if I’m doing the migration this way then I’ve got other, bigger issues that I need to deal with as well.
These commands work best when the wiki service has been started so that the databases are fully built out. To start the wiki service from the command line, use the serveradmin command instead of the wikiadmin command. The serveradmin command is used with the start option and then wiki is used to indicate the wiki service, as follows:
sudo serveradmin start wiki
The service can also be stopped, swapping out the start option with a stop option:
sudo serveradmin stop wiki
In a few cases (this is the main reason I’m writing this article), the attachments to wikis don’t come over during a migration. To migrate the files that are used for QuickLook, downloading attachments, etc., use the serveradmin command to locate the directory that these objects are stored in:
sudo serveradmin settings wiki:FileDataPath
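That command prints the setting as a key = value pair. If you’re scripting the copy, a small helper to pull the path out of that line (the quoted-string form shown in the docstring is my assumption about typical serveradmin output; check yours):

```python
def setting_value(line):
    """Extract the value from a serveradmin settings line such as:
    wiki:FileDataPath = "/Library/Server/Wiki/FileData"
    """
    _, _, value = line.partition(" = ")
    return value.strip().strip('"')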
The output identifies the directory where these objects are stored. Placing the contents in the same relative path as they are to the output of the same command on the target server usually results in restoring them. Once moved, use the fixPermissions option to repair the permissions of any files from the source (if any changes to account IDs are encountered, such as an export/import rather than an archive/restore in Open Directory, this can lead to odd issues):
sudo wikiadmin fixPermissions
Also use the rebuildSearchIndex option with the wikiadmin command to fix any indexing, once the permissions have been repaired:
sudo wikiadmin rebuildSearchIndex
And finally use resetQuicklooks to clear any cached QuickLook representations of objects that have been inserted into a wiki and might not display properly (you know you might need to do this if files look fine when downloaded but look bad in QuickLook, even though QuickLook on the server can view the files just fine):
sudo wikiadmin resetQuicklooks
When done properly the migration can take a while. Keep in mind that every tag, every article, every edit to every article, and basically everything else is tracked inside the tables that you’re moving. While there might not be a ton of data in the Collaboration directory or in an export, all of the data needs to go to the right location. This can take a little time in environments that have a lot of articles, even if they’re really short articles…
krypted October 17th, 2016
Posted In: Mac OS X Server
Apple, Blog, cache, calendar integration, export, import, macos server, rebuild, wiki server
When running mailbox exports, move requests, etc., in Exchange 201x you might get an error. This is because the Management Role Assignments have changed ever so slightly. In order to provide an account the ability to do certain tasks, you can use the New-ManagementRoleAssignment PowerShell cmdlet to process a request. To do so, pick a user (in this case the username is kryptedadmin) using the -User option and choose the role to assign (in this case, Mailbox Import Export) using the -Role option. The command then looks as follows:
New-ManagementRoleAssignment -Role "Mailbox Import Export" -User kryptedadmin
To see if your roles were properly applied:
Get-ManagementRoleAssignment -Role "Mailbox Import Export" | ft Identity
krypted November 2nd, 2013
Posted In: Microsoft Exchange Server
exchange 2010, Exchange 2013, export, get-managementroleassignment, identity, import, mailbox, new-managementroleassignment, role assignment
When you are configuring ExtremeZ-IP as a print server, you will need to set up and configure each printer. However, if you have already set up and configured printer queues for the Windows server, you can import existing queues into ExtremeZ-IP. This can be done programmatically via the ExtremeZ-IP EZIPUTIL command line tool.
EZIPUTIL has a number of options, where the SERVER option is used to configure global settings for ExtremeZ-IP, VOLUME is used to create, edit and delete shared volumes, and PRINT is used to manage shared print queues. Each of the options also has a number of switches for the feature(s) being managed. These are structured as standard switches used in Windows batch scripting. The /IMPORT switch can be used to import print queues. By defining the WINDOWS setting for the import, you will recreate all printer queues from Windows. This command would look like the following:
EZIPUTIL PRINT /IMPORT:WINDOWS
Once the command has been completed, you can then list printer queues using the /LIST switch:
EZIPUTIL PRINT /LIST
Once you have created printer queues you will often end up needing to remove a queue or three. To remove a printer queue, you will use the /REMOVE switch along with a /NAME switch to specify the printer queue that you are removing. For example, to remove a queue called Accounting_499 you would use the following command:
EZIPUTIL PRINT /REMOVE /NAME:Accounting_499
The VOLUME option has a similar feature in the /REPLICATE_SMB switch, which allows you to replicate existing SMB/CIFS shares:
EZIPUTIL VOLUME /REPLICATE_SMB
The /REMOVE switch can also be used with the VOLUME option. If you have created volumes you can also remove those from the command line. For example, to remove a shared volume called Accounting_Files, you would use the following command:
EZIPUTIL VOLUME /REMOVE /NAME:Accounting_Files
krypted March 1st, 2011
Posted In: Mac OS X Server, Mass Deployment, Network Infrastructure, Windows Server
Command line, create queues, ExtremeZ-IP, import, option, PRINT, remove, switch, VOLUME /REPLICATE_SMB
The netsh command can be used to manage network interfaces and control routing, and one of its lesser-used features is the ability to import and export service settings on Windows Servers. This can be especially helpful if you need to normalize data for import into another Windows server, or to normalize it for use with another server platform. To export your DHCP information, from a command prompt in Windows you would run the netsh command along with the service you are exporting settings for (WINS, DHCP, etc). After the service identifier you would indicate the action being performed (i.e., import or export in this context), followed by a file to dump the data to, and finally the subset of the data (we’ll use all for convenience’s sake and throw the data into an easily locatable place on the root of the C: drive, which you obviously need access to for the copy):
netsh dhcp server export C:\dhcpsettings.txt all
Now that you have exported the data, you can copy it to your other Windows Server box and import using the exact same command (assuming the file lives in the same place) but swapping out your export for an import:
netsh dhcp server import C:\dhcpsettings.txt all
DNS is a different beast given that there is a special dnscmd command for managing that service. To export your DNS information:
dnscmd ServerName /enumrecords zonename @ /type A /detail > c:\mydnssettings.txt
Or in CSV:
dnscmd /enumrecords zonename @ /Type A /additional > c:\mydnssettings.csv
One of the most used services for Windows servers, though, is as a filer. File shares are stored in a registry key at HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\LanmanServer\Shares. You can browse here using regedt32 and then export the key. You would then use the Import option in the File menu (Windows 2003 uses Import whereas previous versions use Restore).
Note: Restoring this data will nuke and pave your existing shares on the box you’re running it on and in most cases you will need to restart appropriate services and/or the box to see the new settings.
krypted August 10th, 2010
Posted In: Windows Server
dnscmd, export, export settings, import, netsh, servername, Windows Server, zonename
DeployStudio has the ability to import a csv file that is populated with the MAC address and a few specific settings. This allows you to prepopulate the database with the names that you want each machine to have. If you purchase a lot of machines from Apple then you can get a list of MAC addresses, or you can use a bar code scanner to scan them as you’re unboxing.
If you have a list of MAC addresses (en0), then you will need to format them in a very specific manner. Here, I have included a sample csv file with the data that goes into each field, which I have named DSImporter.csv.
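If the list of MAC addresses is coming out of a scanner or a text file, you can generate the csv rather than pasting. A hedged sketch: the mac,hostname column layout below is an assumption for illustration, so mirror the field order from the sample DSImporter.csv before importing:

```python
import csv

def write_dsimporter(rows, path):
    """Write (mac, hostname) pairs to a DeployStudio import csv.
    Column order is an assumption -- match it to the sample DSImporter.csv."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for mac, hostname in rows:
            # Normalize MAC addresses to lowercase for consistency
            writer.writerow([mac.lower(), hostname])

# Example: write_dsimporter([("00:11:22:AA:BB:CC", "lab-01")], "DSImporter.csv")
```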
Once you paste the data that you’d like into the csv, provide the computer names (these can be pasted or compiled using formulas). Once done, save and then open DeployStudio Admin. From here, click on Computers and then (as you would with iTunes) click on the plus sign (+) and create a new computer list (this step is optional, but I prefer to always import into computer lists, just in case something goes wrong, especially with my first import). Once you have created the computer list, you should see a screen similar to the following.
Next, click on the Server menu and select Import.
Now browse to your csv file and then click on the Import button. When the import is complete you will see a screen informing you as such. Click on the Done button to complete the process.
You will then see your computers listed in the database and should see the names that you assigned them listed as well. You can now set a workflow item in DeployStudio for Reconfigure system with computers database content (shown below). This will set the name (and any other fields you decided to use) from the spreadsheet that you imported into the computer list.
Once you have your computers in a group, you can also set a default workflow for their first-time imaging by clicking on the name of the group and then clicking on the Automation tab at the bottom, as you can see below.
Here, you will set the workflow to run and, optionally, set the computer to have no default workflow moving forward, or have the workflow disabled so users can't accidentally reimage their computers later.
If you don’t have the MAC addresses for your computers ahead of time, you can use the Hostname option instead.
This will enable you to enter the computer name that you would like to use moving forward into the DeployStudio Runtime at imaging time and then have it stored in the DeployStudio database, where it can be used to build future workflows or even be exported and imported into Open Directory computer records.
Overall, the computers and groups in DeployStudio Admin can be used to design increasingly complex imaging sequences and to provide much of the scripting logic that a number of organizations need. Beyond that, JAMF, FileWave, and a few other solutions offer even more logic and even more features, or a little shell scripting can take you a really long way.
krypted August 3rd, 2010
Posted In: Mac OS X, Mass Deployment
Casper, Computer Lists, csv, DeployStudio, DSImporter.csv, en0, import, JAMF, MAC Address, order of fields
Originally Posted to the 318 TechJournal
318 has open-sourced our mergeSafBookmarks Python script. This tool can read in a pair of property lists and merge them into a single resultant bookmarks file for Safari. This takes a lot of the work out of pushing bookmarks to existing users as part of your deployment. You can find it here:
Note: The script also looks at existing bookmarks and doesn’t merge in duplicates.
krypted December 22nd, 2009
Posted In: Mac OS X, Mass Deployment
import, Mac OS X, Mass Deployment, merge bookmarks, programatically, Safari
Recently, I did an article for afp548.com where I explained that you can import a pkcs12 file into an 802.1x profile using networksetup. In that type of environment you would be leveraging TLS or TTLS, with the Mac OS X client acting as the supplicant and the certificate required to establish authentication with the authenticator. So you need the certificate to get started, but how do you get the pkcs12 and dish it out to clients programmatically?
We're going to start out with a new keychain into which we've imported the certificate (or skip this step if you already have a p12 file). First, find the certificate and verify the name, as the name is very important to networksetup. For this, I like to use the security command's find-certificate option. Here we're going to look for radius.krypted.com:
security find-certificate -c radius.krypted.com
Now we'll use the export verb of the security command to dump a .p12 file from the specially created keychain called 8021xkey.keychain to my desktop:
security export -k 8021xkey.keychain -t certs -f pkcs12 -o ~/Desktop/krypted.p12
When run, you'll be asked to set a password that will protect the new p12. Once we have the p12 it can easily be imported, as we will do from the desktop of a client system:
security import ~/Desktop/krypted.p12 -f pkcs12
Now we can use the p12 along with the -settlsidentityonsystemprofile or -settlsidentityonuserprofile options. For example (using the default AirPort as the service and mysecretpassword as the password to decrypt the p12):
networksetup -settlsidentityonsystemprofile AirPort ~/Desktop/krypted.p12 mysecretpassword
Overall, at this point you can finally automate the process of setting up the 802.1x aspect of a deployment using a script or a package. Simply set up profiles in the GUI, import them into the new computer (assuming you have set up the service names beforehand) and, if need be, import the certificate. Much testing required, though…
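To give a sense of what such a script might look like, here's a minimal sketch of the client-side half. It only echoes the commands (a dry run) so they can be reviewed before being run for real; the paths, service name, and password are the examples from this article, so substitute your own:

```shell
#!/bin/sh
# Dry-run sketch of the client-side 802.1x steps from this article.
# These values are the article's examples; adjust for your deployment.
P12="$HOME/Desktop/krypted.p12"  # the exported pkcs12
P12PASS="mysecretpassword"       # password set when the p12 was exported
SERVICE="AirPort"                # verify with: networksetup -listallnetworkservices

# Echo instead of executing, so the commands can be sanity-checked first.
# Drop the echoes once you've verified them on a test machine.
echo "security import $P12 -f pkcs12"
echo "networksetup -settlsidentityonsystemprofile $SERVICE $P12 $P12PASS"
```

Wrapped in a package postflight, this gets the certificate and the profile identity onto a client in one shot.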
krypted September 9th, 2009
Posted In: Mac OS X, Mac OS X Server, Mac Security, Mass Deployment
802.1x, export certificate, import, import profiles, Mac OS X, p12, pkcs12, Radius, supplicant, tls, ttls
Whether you're going from Open Directory to Active Directory or from Active Directory to Open Directory, chances are you'll encounter csvde along the way. Csvde is installed on Windows Server and allows you to interface with Active Directory using csv files. Csvde can import files using the -i switch, followed by the -f switch to indicate the file that you are importing, followed by the path of the file. So if you save a file called toimport.csv to the root of your C drive temporarily, you would use the following command to import the objects in the rows of the file:
csvde -i -f c:\toimport.csv
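As a minimal sketch of what toimport.csv might contain, the first row names the attributes and each following row is one object. The contoso.com domain here matches the example used later in this article:

```csv
objectClass,sAMAccountName,dn
user,johndoe,"CN=John Doe,OU=Users,DC=contoso,DC=com"
```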
Now, what does that file need? At a minimum, the file needs to indicate the objectClass for each user, the user's sAMAccountName, and the dn. Such a file can be used to import a user called johndoe. But how do you build a csv file like this from Open Directory? There are a number of ways, but here's one that works pretty well for me. First, let's use dscl to dump a list of the long and short user names:
dscl /LDAPv3/127.0.0.1 -list /Users cn > import.txt
Now from Excel, click on File, then Import, select to import from a Text file, and click Import. Then browse to and double-click on your file, which, if you used the above command, would be called import.txt. When it asks you for the Original data type, choose Fixed width. This will dump two columns: one with the short name, another with the full name.
Now, download and open this spreadsheet I made for y'all. Paste the shortname column into the sAMAccountName column. Then paste the column with the full name into column D, where John & Jane Doe are now. Then copy the user (objectClass) entry in column A to the number of rows you actually have (they will all be users), and copy the CN= in column C to all of the rows you need, along with the , from column E. Finally, the OU/search base information for your Active Directory will need to replace mine. So if your Active Directory domain is called contoso.com (don't laugh, I've seen it in production) and the OU you are going to use is Users, then replace that text with OU=Users,DC=contoso,DC=com. Once you have all of the information filled in per row, notice that column G will automatically update. If you look at the formula, I'm just merging the contents of columns C through F. Copy the contents of rows 2 and 3 down through the end of your users.
Now you can take the information from column B and paste it into toimport.csv, and then take the information from column G and paste it into column C of the toimport.csv file (using Paste Special and pasting only the value, NOT the formula). The objectClass will need to be filled in as user for each user as well (easily enough, this is user). Passwords can't be imported, so using the 3 attributes in toimport.csv along with the csvde command referenced earlier in this article, give it a shot.
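If you'd rather skip the Excel gymnastics entirely, the same transformation can be sketched with awk. This assumes a tab between the short name and the full name in the dscl dump (adjust the -F separator if your import.txt uses aligned spaces instead), and it reuses the contoso.com OU from the example above, so substitute your own search base. The sample input here is generated inline just to make the sketch self-contained:

```shell
#!/bin/sh
# Sample of what the dscl dump might look like (tab-separated here;
# change -F below if your import.txt uses aligned spaces instead).
printf 'johndoe\tJohn Doe\njanedoe\tJane Doe\n' > import.txt

# Build a csvde-ready file: header row of attribute names, then one
# row per user. The OU/DC values are this article's contoso.com example.
awk -F'\t' 'BEGIN { print "objectClass,sAMAccountName,dn" }
{ printf "user,%s,\"CN=%s,OU=Users,DC=contoso,DC=com\"\n", $1, $2 }' \
  import.txt > toimport.csv

cat toimport.csv
```

Either way you build it, the resulting toimport.csv feeds straight into the csvde command above.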
There are a number of other attributes that you will likely want to pull in and maybe augment as well. However, it’s late and I’ll have to talk about those later. In the meantime, do 1-2 users at a time until you feel confident to let csvde rip on all 10,000. I also strongly recommend bringing the initial import into a unique OU so that you can remove them all easily if things go wrong.
krypted August 19th, 2009
Posted In: Mac OS X Server, Windows Server
Active Directory, csv, csvde, Excel, import, LDAP, ldif, merge fields, Open Directory