Mostly an iPad Followup

Somewhat appropriately, I finished and edited my previous post about switching to an iPad for many of my computing use cases on an iPad. Luckily I have access to one which belongs to my employer and happens to be running the iOS 11 Beta. Since I was traveling at the time, I could also grab the Magic Keyboard which usually sits on my desk for typing purposes. Here are some observations.

First of all: this obviously wasn’t an exact facsimile. The iPad Air 2 I was using has a slightly smaller screen and significantly less processing power than the iPad Pro I’m considering. It also doesn’t support the Apple Pencil, so that’s something I couldn’t test in the field. Lastly, much as I love the Magic Keyboard (it might be my favourite keyboard ever, in fact), it’s far from an exact replication of the iPad Pro Smart Keyboard. It’s a bit nicer to type on from what I can tell, but it’s nowhere near as mobile, and it doesn’t attach to the device for more laptop-like use.

With all of that in mind, the first thing I’d note is that the experience of typing text was really good. I did some work in iA Writer, then moved it into Ulysses in the interests of trying something new (for the record I’m typing this in Bear for much the same reason). All of this worked really nicely. Ulysses in particular came pretty close to pulling me all the way over.

When it came to editing, it was great to be able to move to a chair, flip the device to portrait and change the context as much as possible. I found that the on-screen soft keyboard was more than equal to the task of making minor edits here, especially with the improved keyboard in iOS 11.

One thing which didn’t work quite so well was moving large chunks of text around. This is pretty trivial with a mouse or trackpad. It’s also very easy with a hardware keyboard (and its cursor keys). With iOS’s on-screen selection system I found this much more awkward. Hopefully this is just a matter of inexperience and I’m actually missing a key insight which will make it much easier. Perhaps there will be improvements when apps start using the new iOS 11 APIs.

Moving content between apps is still pretty awkward, but again I expect iOS 11 will improve this. Moving your attention between different apps I actually find to work really well on the iPad, though. The much more formal system of how apps appear on the screen seems to be easier for me to handle intuitively (or at least reflexively).

Ulysses has built-in functionality to export directly to Medium and WordPress. It would be really good if it had the same functionality for Ghost, because the Ghost web UI really doesn’t scale well to the iPad-sized screen. It seems to fall into something of a blind spot. Hopefully the new editor in Ghost 1.0 will improve this, but it doesn’t seem to have been rolled out to my blog yet. In fact, thanks to WordPress’s very solid mobile app, this whole process would have been much simpler if I were using WordPress[1].

I also spent some time using Affinity Photo to develop RAW files taken with my mirrorless camera. The short summary is that it works really, really well on the iPad. It was a little slow at times, but given I was using it on the lowest-specced device it’s compatible with, that’s quite understandable. If I was doing anything more delicate than developing, I think the Apple Pencil would have been very nice to have. Doing this kind of work whilst sitting comfortably cross-legged on a sofa is very nice indeed.

The experience of getting the photos out of the mirrorless camera and onto the iPad was also pretty good. The iPad’s SD card-based connection kit does its job really well. I suspect the USB-based version is probably the better option most of the time, though. I’ll probably switch to that and put up with also needing to carry an additional cable. Whilst it was straightforward, it was a little slow. Wireless transfer of RAW files (which tend to be big) would be way more convenient, but slower still. It might be worth it.

A quick aside about the way in which the Photos app stores and displays RAW files: If you shoot in RAW+JPEG (so your camera outputs both a RAW file and a developed JPEG), Photos displays this as a single photo. Any changes you make are applied only to the JPEG. Photos itself barely even acknowledges that the RAW exists. Affinity will open the RAW file by default when you import from your photo library, but it would be awesome if it could also act as an extension and allow you to redevelop the RAW file from within Photos.

I also used it as a media device during the flight[2]. So much nicer than the built-in option. The Netflix UI is light years ahead of any of the grotesque “entertainment systems” I’ve encountered on a plane. The iPad screen is much better than any in-seat screen. As an added bonus: when the pilot or cabin crew decide it’s time to wax lyrical, you get to choose whether to listen or stick with the entertainment you chose.

As final thoughts: mobility and battery life were both glorious. With a SIM card and a decent international plan, it could be amazing.


  1. In actual fact, thanks to really solid tools like the Working Copy git client, a lot of this might even have been easier using Jekyll and GitHub Pages. I still really like Ghost and intend to keep using it for this blog, but I can’t deny that frustrations like this keep piling up. I’m going to be looking at other options for some other side projects I have in mind. ↩︎

  2. I watched the “Netflix Original” movie Spectral, in case you’re interested. I would recommend it if you (like me) are a fan of a) very well made B movies; or b) Doctor Who, with which it shares an attitude to “science”. ↩︎