
Okay, one thing I want to touch on right away. I was reading through my old post about UniFi and wanted to update a few items from it. Obviously a lot has changed in the past seven years, and this probably warrants a more in-depth post, but for now I'll throw out some quick bullet points. It's probably worth noting that I have worked with UniFi extensively since my last post in February 2013 – in fact, the computer I am typing this on is connected via a UniFi Switch 8, and I have multiple APs at home, as well as our setup at work and many client sites with one or more APs.

Key differences:

  • Most APs now support standard 802.3af/at PoE. There are a handful of older APs that only support Ubiquiti's passive 24V PoE, and a handful of other devices that support both but have limits in one direction or another. There are also quite a few newer devices that do not support the old passive standard at all. Thankfully, Ubiquiti publishes a handy support matrix.
  • The management stack has come a LONG way. It absolutely does run on Linux now (as well as Windows and macOS), and there are all kinds of places to run it other than an on-premises machine. My personal preferred approach is a tiny Linux VM (I use a Debian 9 VM with 2GB RAM, 1 CPU core and 30GB of disk at home – and I'm sure it could be made smaller), but I'm interested in playing around with the various Docker images that are out there. Ubiquiti also offers their CloudKey management appliances (which are okay but pretty expensive for what you get), and it's easy to run the controller in the public cloud if need be.
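For anyone curious what the Docker route looks like, here's a rough sketch. The image name, port mappings and volume path are assumptions based on the commonly used linuxserver.io image, not something I've battle-tested – check Docker Hub for the current image and tags before relying on this. The script defaults to a dry run that just prints the command.

```shell
#!/bin/sh
# Sketch only: one way to run the UniFi controller under Docker.
# Image name, ports and volume path are assumptions (linuxserver.io's
# image) -- verify against Docker Hub before use.
# 8443 = web UI, 8080 = device inform, 3478/udp = STUN.
CMD="docker run -d --name=unifi \
  -p 8443:8443 -p 8080:8080 -p 3478:3478/udp \
  -v /srv/unifi-config:/config \
  --restart=unless-stopped \
  linuxserver/unifi-controller"

# DRY_RUN (default on) prints the command instead of executing it.
if [ "${DRY_RUN:-1}" = "1" ]; then
  echo "$CMD"
else
  eval "$CMD"
fi
```

Once it's up, you'd point a browser at port 8443 on the host and adopt your APs from there, same as with a controller on a VM.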
  • Ubiquiti now offers a cloud management interface (which plugs into the local controller, wherever it happens to run). I don't have a ton of experience with it, but it's certainly handy.
  • Greatly expanded product line – in addition to many APs, there are switches, camera gear, lighting, firewalls, etc. There's also a ton of non-UniFi gear in their product line. How good that gear is in absolute terms varies a bit, but the value for money is typically hard to beat.

Again this is probably something I could write several posts about but for now I wanted to throw out some commentary to update my post from 2013.

Well, not that long. Eighteen months with no new posts is still way less than my last update gap. Still wearing my consultant hat, still a coffee and beer lover, Windows 7/2008 have finally gone EOL, Palo Alto is buying everything in sight, Intel continues to get security egg on its face, and the industry is as interesting as ever.

I don’t know that I will ever keep updating on a regular basis, but perhaps I will. I certainly have plenty to say these days. I just got done moving the site from a relatively ancient CentOS 6/Apache/MySQL backend to a modern-ish Debian 10/nginx/MariaDB VPS (still at Linode and still using Cloudflare), which is probably worthy of a post on its own. My core Linux/UNIX background came from my high school days running FreeBSD 4.8 (and Apache 1.3!) on an old Pentium P5-200 in my bedroom. I’ve obviously kept those skills up to date, but I’m still used to thinking in those terms (and I still support a lot of CentOS 6.x), so dealing with systemd and nginx config files requires a bit of extra thinking.

Anyway, perhaps more to come soon.

It’s been a while! Lots to catch up on – plenty of tech industry changes since 2013, even in the relatively small number of areas I specifically touched on previously. I still like Ubiquiti (even more now than five years ago), I spent a couple of years using macOS almost exclusively at my day job, Windows 8.1 was eclipsed by Windows 10, Dell divested SonicWALL, and oh yeah, I’m consulting these days.

Anyway, this is a long way of saying that I have a lot to talk about. Amazingly this dusty old WordPress has held up pretty well (and upgraded cleanly to the latest version!), so it looks like I’ve still got a venue for it. I’m sitting much more on the backend side of the house these days but I still have some opinions on the desktop area of things (to say nothing of mobile).

In a slightly interesting but not surprising twist, it was reported today that adoption of Windows 8.1 on the desktop has eclipsed that of OSX 10.9 (per Net Applications’ numbers).

This isn’t overly surprising (since Windows has such a commanding market lead, and 8.1 is a free upgrade) but it’s interesting to note. I’m one of the few people who mostly liked Windows 8, and I feel like 8.1 added a lot to help make the Metro (sorry, Modern UI) transition smoother.

With that said, it seems like both Apple and Microsoft are struggling a bit to figure out where to go next with their desktop OS. For that matter, I don’t think they are the only ones. I haven’t used Ubuntu in a while, but last time I did they seemed to be headed in a similar direction.

It will be interesting to see what’s next for the desktop as a whole.

Okay, I have a confession to make. I used to be a HUGE Apple snob, years ago. I cut my computing teeth on Apple IIs and Macs, and was a huge believer that the Apple Way was the Best Way right up until I started wanting to learn to program and otherwise go really in depth with my machines. Not something System 7/8/9 were very well suited to, not to mention the impossibility of building a Mac from scratch at that time.

So right around when MacOS 9.0.4 was new, I switched from the family iMac DV+ (a 400MHz G3 that my mother still uses once in a while) to a boring but cheap (important at age 13!) Wintel OEM box (a Compaq Presario 5062, if I recall correctly). From there, I delved firmly into that world with a raft of 486 and P5/Pentium boxes built from scavenged parts (no better way to learn hardware in depth). I experimented freely with operating systems (or at least the ones I could get before I had a CD burner) – Windows 95/98, NT 4.0, Caldera OpenLinux (hah!), Red Hat 6 and 7, and probably others I’ve forgotten.

When I got into high school, my scavenging abilities increased with my income, and I finally got broadband and a CD burner, meaning I could experiment with anything I could get my hands on. I ran Windows 2000 Advanced Server as my desktop OS for a good couple of years. This was also when I started really messing around with systems stuff – I built a server out of an old P5-200 machine I’d gotten surplus from my dad’s then-employer, and learned FreeBSD 4.x. Soon after, I ended up building a second “server” running Windows 2000, and built my first WIMP machine (Windows, IIS, MySQL, PHP) to host a phpBB forum for me and my friends.

College rolled around, and with it lots of in-depth hands on time with Slackware, Windows Server 2003 and of course the venerable Windows XP, which remained my desktop OS of choice for a good long while. I learned C# .NET and Java, and got my first taste of Arch Linux which I found I really liked. I also reluctantly switched my primary desktop to Vista, because I had more than 4GB of RAM and I needed a 64-bit OS that I could still game on.

When Windows 7 hit RTM, it became the first Windows version I ever upgraded to on release day (and the first one that was usable out of the box on release day). Soon after, I left college and got my first real IT job, which was at a mostly Windows shop (Server 2003 and later 2008 in the back, and XP/7 up front). I managed to inject a bit of Linux later on.

These days, my employer is mostly RHEL in back and Windows 7 up front. This is a really roundabout way of saying I went over ten years without more than casual contact with a Mac. I occasionally needed to use one at school or the like, but mostly found OSX really frustrating and UI-inferior compared to Windows and Linux. I did eventually own a Pismo G3 PowerBook I bought at Defcon 17 in 2009, but I mostly used it to boot OS 9.1 for old games. It could boot Panther, but I rarely used it that way – it was just too slow (384MB RAM and a 333MHz processor do not a good experience make). Plus, it lacked AirPort and didn’t have a working battery, so loading software either tethered me to an Ethernet cable (and the frustration of using way-outdated browsers) or required lengthy USB 1.1 file transfers (too old for FireWire).

More recently (this past spring), I ran across a G4 iBook at the local CeX and, after confirming it supported Classic, snapped it up. Again, my primary motivation was to run old software (Escape Velocity! Old shareware CD-ROMs!), but I was curious to play around with a slightly newer OSX version (Tiger, in this case – it would run Leopard, but sans Classic). I found OSX slightly less intolerable than I had before, and found a couple of things I even liked (Finder was slightly less awful! Network Utility was pretty cool!). TenFourFox and AirPort meant it was a lot more usable than the G3, and there was some fun nostalgia, since it was from around the same era as when I had left Macs behind.

This (very lengthy) exposition leads up to today (or, more specifically, last week). Every year, the r/MyLittlePony community runs a charity event that includes a grid computing competition (Rosetta@Home, this year). I was messing around on Amazon to see if I could get any super-cheap machines to throw at my account, and discovered a number of Marketplace sellers selling Mac Minis for next to nothing. I was intrigued and considered buying one as an HTPC. I discarded that idea, since 1) I already have an HTPC and it hardly gets used, and 2) I rarely use desktop machines other than my primary desktop anyway. However, this did pique my curiosity enough to get me to price older MacBooks.

Sure enough, they were fairly inexpensive, too – the first- and second-generation Intel models (Core Duo and Core 2 Duo) were under $200 in some configurations. I dug around and found that my best bet was the second-gen Late 2006 models in that range – I got the 64-bit C2D processor and the ability to run Lion if I really wanted to. Obviously this isn’t super up-to-date hardware, but it’s a lot newer than anything I had gone in-depth with before. I ended up ordering a Late ’06 White 1.86GHz C2D model, and it arrived last Friday (shout out to Pacific Macs on the Amazon Marketplace for a great deal and fast shipping).

This is the first Mac I had bought with the express purpose of “get to know a somewhat modern version of OSX for once”. Armed with a Snow Leopard installation image, a copy of iLife ’11, and a couple of USB sticks, I dove in.

First order of business was creating a bootable USB stick (I didn’t have a factory OS CD, and EveryMac informed me that the DVD drive in my machine didn’t support dual-layer DVDs, so burning the 6GB DMG myself was a no-go). This turned out to be a pretty easy process, as the machine did come with a working Mountain Lion install (I just wanted to do it fresh). I copied the DMG on to an external hard drive, connected the drive and a USB stick to my new Mac, and used Disk Utility to restore the DMG to the USB stick, and I was set. There was a little confusion as I didn’t realize the stick would not show as a bootable device in the Startup Disk control panel, but once I RTFM’d, I saw I needed to reboot with the Option key held down and I was good to go.
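For the curious, the Disk Utility restore step can also be done from Terminal with `asr` (Apple Software Restore), which is what Disk Utility's Restore tab drives under the hood. This is just a sketch: the DMG path and disk identifier below are placeholders (use `diskutil list` to find the real one), and `--erase` will wipe the target stick, so the script defaults to a dry run that only prints the command.

```shell
#!/bin/sh
# Sketch only: restoring an install DMG to a USB stick with asr.
# Both paths are hypothetical placeholders -- find the real disk
# identifier with `diskutil list` before pointing --erase at anything.
DMG_PATH="/Volumes/External/SnowLeopard.dmg"   # placeholder path
TARGET_DISK="/dev/disk2"                       # placeholder identifier

CMD="sudo asr restore --source $DMG_PATH --target $TARGET_DISK --erase --noprompt"

# DRY_RUN (default on) prints the command instead of executing it.
if [ "${DRY_RUN:-1}" = "1" ]; then
  echo "$CMD"
else
  eval "$CMD"
fi
```

Same end result as the Disk Utility route, and the same caveat applies afterward: the stick won't show in the Startup Disk control panel, so you boot from it by holding Option at startup.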

Pretty painless (although slow) so far. I re-installed OSX, ran Software Update and went to bed. First pain point – when I woke up, I discovered that the machine had gone to sleep in the midst of downloading updates. Not the end of the world, but irritating nonetheless. I finished the updates, installed iLife (for GarageBand) and started playing around. All my usual software was either available or had easy Mac equivalents – Dropbox, Evernote, Firefox, Chrome, PuTTY (not needed), MS RDP, etc.

I found that I was enjoying using the machine, but was frustrated at how slow disk I/O was (a 5400RPM drive will do that). I was also concerned about disk space (60GB ain’t that big these days). Since this was now the Saturday after Thanksgiving (aka Black Friday weekend), there were some deals on. I started looking and discovered that Best Buy had an insane deal on Intel 530 SSDs. The X25 in my primary desktop was getting a little long in the tooth, so I decided to trek over to my local store and see what I could find. I managed to score the 240GB model for a good price, which freed up the X25 to go in the MacBook.

I was pleasantly surprised by how easy the physical install was – I was expecting something on the order of the 15-step iFixit guide for my iBook G4, but it’s more like a 3-step process: remove the RAM shield, remove the drive from the machine, remove the drive from its carrier, then reverse. It was perversely amusing to see Windows 7 attempting to boot from the SSD – I knew it was compatible, but I didn’t realize the Windows 7 bootloader was EFI compatible when it hadn’t initially been installed that way.

This is when I ran into my first real problem – I had used Disk Utility to image the previous hard drive, and when I tried to restore it, it failed with a completely unhelpful error message. A quick trip to Google confirmed that my image was hosed – turns out that trying to image the boot drive results in a silent failure. In retrospect, I should have just done a Time Machine backup and restored from that, but at the time it didn’t occur to me.

I quickly evaluated my options – I tried putting the original drive in a USB enclosure I had (to do a disk-to-disk copy), but found that it wouldn’t spin up when connected to the Mac. I suspect this was an issue with my cheap and very well used enclosure. At this point, I elected to just re-install – I could have found a different enclosure, or re-installed the original drive and taken a fresh image or Time Machine, but it didn’t seem worth it. I was also curious to see how much faster the install would be to an SSD target.

I didn’t time either install, but it didn’t seem like the SSD install was all that much faster – certainly not even close to the night and day difference it makes with Windows. It seems like the USB interface on this particular machine is quite slow, even considering it’s USB 2.0.

Anyway, this brings me up to now – I have a solid (if slightly old) OSX install with all my basic software. I’ve been using the machine heavily last night and today and am finding I like it quite a bit. This is a big admission for me, as I had found previous versions of OSX near-unusable for day to day stuff. It’s taken me a little time to get used to it, and I need to make a bullet list of likes and dislikes, but I am impressed so far.

However, that brings me to another point – I don’t see what’s BETTER than the alternatives (mainly Windows 7 and 8, given the sorry state of the Linux desktop). I could totally use OSX as my day-to-day OS, but I’m not sure I understand yet why I would make that choice. There are some neat features (Exposé is fantastic, but is it better than Alt-Tab, Win-Tab and the Show Desktop hot corner? Network Utility is great, but I have equivalent tools installed on my Windows machines anyway), but so far I haven’t seen anything compelling enough to make me want to switch full time.

Anyway, I am going to keep using it – this is a fantastic beater laptop, and the battery is in surprisingly good shape for its age (I’m able to consistently get a little over three hours with WiFi and Bluetooth on). I’ve been sitting in Starbucks since roughly 4PM – it’s now just after 7 and I’m showing about 20 minutes remaining. Not as good as the 6+ hours I get from my MSI GE40 (which is great, but a little fragile) or my work Latitude E6430 (bulletproof but hefty), but for a $195 machine I am pretty happy.

I’m definitely looking forward to getting to know OSX better – I enjoy technology for technology’s sake, and since it seems to mesh with me a lot better than iOS (which I often find frustrating), I am happy to get familiar with it. Maybe at some point I’ll stop hitting Control instead of Command by rote.

Edit: I realized I forgot to note a couple of key points here. First (and most obviously), this is based on 10.6.8, which (while still fairly new to me) was released in 2009 and last updated (other than security updates) in 2011. Given that Windows 7 was released around the same time, I think it’s mostly fair to compare them, but it’s undeniably true that Apple is (for better or for worse) on a fairly aggressive release schedule, so Snow Leopard is still three major versions behind. Whether this makes my experience less valid is hard to say – I may upgrade my current MacBook to Lion at some point, in which case I’m likely to do a follow-up post. I suspect that one will be less flattering, as Lion is when Apple really started to go nuts with skeuomorphism, which is something I find irritating at best and an impediment to usability at worst. In fact, one of the reasons I so prefer Android to iOS is Google’s much lighter touch with skeuomorphs.

I may do a follow-up post at some point about this and other UI items I find interesting – menu bar location, for example. I’ve always preferred the application-centric approach that Microsoft took with Windows, but I found that I hardly noticed the difference when I was using OSX, almost entirely because the menu bar has become so minimized in almost all applications on either platform.

I’d love to do an apples-to-apples comparison of OSX 10.9 and Windows 8.1 (which I’ve had quite a bit of hands-on time with), but that’s not likely to happen anytime soon – I can’t really justify the purchase of a newer/more powerful Mac, and I’ve never felt that running desktop OSes in a virtual machine netted a fair comparison (not to mention that running OSX on non-Apple hardware is a bit of a pain, as well as being against Apple’s EULA). We’ll see what the future brings, however. I’m not a big fan of the mobile-ification (is that even a word?) of desktop OSes, and both Apple and Microsoft seem to have taken big steps in that direction.

This past week I was called upon to make some upgrades to the wireless buildout at my company’s corporate office. Some quick background: that office supports a user base of about seven people day to day, but because most of our workforce is remote, there can be significantly more on site at times. This past week, we had several out-of-towners around (including myself and my boss), so things were under more stress than usual.

Anyway, the first day I was on site, my boss (who is technical, but not from a systems/networking perspective) noticed a significant difference in Internet performance between wireless and wired connections at the home office. We use Skype pretty extensively for video chat, so if somebody has a marginal Internet connection it tends to get noticed pretty quickly. I ran some additional tests and found that being on wireless was cutting Internet performance by almost 50% (let alone the effect on local performance, though there are very few local services). After hours, I re-ran my tests and found that things were pretty much back to normal. Okay, seems like a load issue.

We also had some (comparatively minor) signal strength issues in parts of the office (which was frustrating given its small size). I had also been dealing with a compatibility issue between the existing access point (built into our Sonicwall TZ205W firewall) and the Intel wireless cards on a pair of Lenovo X220 laptops that was causing their connections to drop periodically, requiring the wireless card to be disabled and re-enabled in order to get things going again. Because I’m remote, I had temporarily re-enabled the access point built into our Cisco UC320W PBX for the two affected people to use until I could investigate. The two units are right next to each other in the rack, so I have little doubt crowding the spectrum even more was unhelpful.

My boss was putting some pressure on me to deal with the issues in some fashion, and I was not overly happy with the existing wireless infrastructure. Given the signal strength issues, it seemed like a second or third AP would be a good idea. I looked into a couple of different options. First on my list was adding SonicPoints (Dell-Sonicwall’s standalone AP solution). That got discarded quickly because I needed them the next day and I didn’t have access to a vendor with both stock and an appropriate shipping commitment. (Side note: I wish Dell would better integrate Sonicwall purchasing into their direct ordering chain. When I bought the initial firewall, I had to figure out where the Sonicwall store was hidden on their site, and it didn’t even ship from Dell – it was dropshipped from Ingram Micro.)

I also looked at Meraki (same issue) before I came to Ubiquiti. I had heard of them before, and the buzz on Reddit (r/sysadmin and r/networking) was mostly positive. To boot, I could get a 3 pack kit from Amazon overnighted for just over $200. (It’s no substitute for a good IT vendor like CDW or MoreDirect, but Amazon Prime is *amazing* for certain things. We’ve saved buckets of money on shipping because of it.) I went ahead and ordered the basic kit – 3 of their standard model access points, mounting hardware and PoE injectors. As promised, it arrived the next day.

Setup was almost totally painless – the hardest part was figuring out where to put the access points. Note that the UniFi APs use non-standard passive PoE, so if you already have standard PoE on your network, you’ll need Ubiquiti’s adapters. I didn’t have PoE at this office, so no problem. I set up the injectors on the correct runs (thank goodness I had everything well documented and labeled!), put up the access points and installed the controller software.

The other quirk here is said controller software – rather than having a web interface, the UniFi uses a Java app installed on a Windows or Mac PC (no Linux support I could find, though I also haven’t tried) to configure the access points. The machine running the controller software will also act as the Web server if you elect to use a captive portal for a guest network. This was the first little hitch I ran into – I didn’t have a suitable machine! I have no servers on site (or with direct connectivity), and because of our legacy as a remote company, everybody uses laptops. I did some quick digging on Ubiquiti’s site and found that I was okay after all. Once a config is applied, the controller machine isn’t necessary unless you are using a captive portal. The APs will go into a standalone mode and keep their config indefinitely.

Knowing that, I went ahead and used my laptop for now (we’ll eventually have at least a minimal server infrastructure there for a domain controller, and I will move it at that point). The software is pretty slick – I was immediately prompted to upload a floorplan for the office (which I happened to have), and I was then able to place the access points on the map exactly where they sit in the office. If you were managing a sizable deployment, that seems like it would be super handy.

I quickly configured a pair of networks (one for employees, one for guests) and was able to jump right on from a test machine. That’s about the extent of the testing I’ve done so far – I don’t yet have any employees on the new network (which will happen soon), but it seems to work well. I saw significantly better performance than through the Sonicwall, though still less than I would have liked (project for later, I guess!).

My initial impressions of the UniFi platform are very positive:

  • Inexpensive and widely available.
  • Popular (easy to get help if needed).
  • Scalable (adding APs is as easy as plugging them in and adopting them via the management controller).
  • Lots of features (captive portal, etc.).

The big cons I see thus far (not many):

  • Non-standard PoE requires adapters if you already have a PoE investment.
  • No Linux support for the management interface.
  • Requires an always-on machine for captive portal functionality (which most people probably have anyway).

I’ll update as I play with them more and my impressions change.

Hey everyone and welcome! I’ve done a quick sum up of who I am and why you should care on the about page.

A quick note: I’m going to pre-seed the blog with a number of posts to start with (covering a significant real-life timescale), so if it looks like I went a little nuts with posts vs. chronology, that’s why.