Workflow Updated for iOS 11 and the iPhone X

Anybody who knows me knows that I love Workflow for iOS. I followed its development prior to its initial launch and, about a week after it launched, I began helping the team with customer support. A few months later I was hired full time, and I had the privilege of working with the wonderful Workflow team for two years. Since the app was acquired by Apple in March 2017, updates have seemed less frequent. I’m still optimistic that this is for a good reason, yet to be revealed. Fortunately, the recent 1.7.7 update may help those whose faith had become shaky in recent months.

With 1.7.7, Workflow is updated to support the iPhone X, as well as drag and drop on iOS 11. I mentioned on Twitter a few weeks ago that I would love an app similar to the shelf app concept, except that rather than just storing data and providing it in multiple data types, it would intelligently run actions on the data you drop into it, based on which action you drop it on. Well, there are plenty of shelf apps — including my favorite so far, Gladys — but with the latest update, Workflow meets the need I described. Now you can drag content and drop it into a workflow in the Workflow app, and it will run that workflow using the dropped content as input.

This is something I’ve been hoping to see since iOS 11 launched and it makes sense that Workflow would be the app for this. Now you can drop an image into a workflow and have it uploaded to Dropbox, or drag a link from Safari into a workflow to share it with predefined recipients and text.

To drag and drop content as input, simply drop the item on the Run Workflow button (the one that looks like a play button) — that entire rectangular section is a drop zone and will become highlighted to indicate this when content is dragged over it. From the main workflow view, you can hover over a workflow with dragged content for a second and the workflow will open, allowing you to drop the item(s) inside the workflow to run it.

I made a quick test workflow after updating. It accepts the URL of an article and generates Markdown-formatted text in the following format:

```
# Title
By: Author on Publish Date
> Excerpt
**From:** URL
```

I can drag a link from the address bar in Safari, and drop it on the Run button at the top of the workflow. This will take that URL as input and run it through the Get Article from Webpage action, and then extract all the desired details into a single Text action using Workflow’s powerful and efficient magic variables system.
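For anyone curious what that workflow is doing under the hood, here’s a rough Python sketch of the equivalent logic. This is only an illustration of the idea, not how Workflow implements it; the use of requests and BeautifulSoup, and the particular meta tags queried, are my own assumptions:

```python
# Rough sketch of the test workflow's logic in Python, not how Workflow itself works.
# Assumes the `requests` and `beautifulsoup4` packages; the meta-tag selectors are illustrative.
import sys
import requests
from bs4 import BeautifulSoup

def article_to_markdown(url: str) -> str:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Pull roughly the same fields Workflow's "Get Article from Webpage" action exposes.
    title = soup.title.string.strip() if soup.title and soup.title.string else "Untitled"
    author_tag = soup.find("meta", attrs={"name": "author"})
    author = author_tag["content"] if author_tag and author_tag.get("content") else "Unknown"
    date_tag = soup.find("meta", attrs={"property": "article:published_time"})
    published = date_tag["content"] if date_tag and date_tag.get("content") else "Unknown date"
    desc_tag = soup.find("meta", attrs={"name": "description"})
    excerpt = desc_tag["content"] if desc_tag and desc_tag.get("content") else ""

    # Assemble Markdown in the same shape as the Text action above.
    return (
        f"# {title}\n"
        f"By: {author} on {published}\n"
        f"> {excerpt}\n"
        f"**From:** {url}\n"
    )

if __name__ == "__main__":
    print(article_to_markdown(sys.argv[1]))
```

In Workflow itself none of this code is needed, of course; the Get Article action’s magic variables drop straight into the Text action.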

I hope to see continued development and, as I’ve mentioned multiple times here and elsewhere, I remain optimistic about the future of Workflow. I’d like to see better drag and drop support for reordering workflows in the main view (something that has always been very buggy for me) as well as in the Settings for Watch and Today Widget workflows. The custom drag and drop implementation used for building workflows and moving actions has always worked well for me, and I don’t know whether it could be improved by adopting the new standard from Apple (or perhaps they have in this version?). There’s always room for improvement, and I’m confident we will see more from them over the next year, but for now, at least, I’m happy to see the development continue.

In addition to drag and drop and iPhone X support, 1.7.7 also brings support for the new image formats in iOS 11, some new health data types, new iOS 11 Files support such as handling multiple files with the Get Files action, and some other improvements — all welcome in Workflow’s first iOS 11 update. You can view the entire list of features and fixes here.

If you’re not using Workflow already, check it out. The app has been free since the acquisition, so you have nothing to lose but time (which you just might gain back with Workflow). Be sure to check out the official documentation, r/workflow on Reddit, and the in-app gallery for help, examples, and inspiration.

MacStories: A Photo Tour of Apple’s New Flagship Chicago Store

Over at MacStories, John Voorhees published a nice photo tour of the new Apple Store that just opened in Chicago. The photos probably don’t do it justice:

It’s hard to appreciate just how completely the store disappears into its surroundings unless you see it for yourself. The illusion is enhanced by even the smallest touches like the grooves in the staircases that continue from the interior of the store to its exterior.

It looks incredible, and the photos are pretty great; I hope to get to see it in person in the near future.

On Personalizing “Hey Siri”

From Apple’s Machine Learning Journal, in a piece about what goes on behind the scenes on your devices when you say “Hey Siri”:

We designed the always-on “Hey Siri” detector to respond whenever anyone in the vicinity says the trigger phrase. To reduce the annoyance of false triggers, we invite the user to go through a short enrollment session. During enrollment, the user says five phrases that each begin with “Hey Siri.” We save these examples on the device.

We compare any possible new “Hey Siri” utterance with the stored examples as follows. The (second-pass) detector produces timing information that is used to convert the acoustic pattern into a fixed-length vector, by taking the average over the frames aligned to each state. A separate, specially trained DNN transforms this vector into a “speaker space” where, by design, patterns from the same speaker tend to be close, whereas patterns from different speakers tend to be further apart. We compare the distances to the reference patterns created during enrollment with another threshold to decide whether the sound that triggered the detector is likely to be “Hey Siri” spoken by the enrolled user.

This process not only reduces the probability that “Hey Siri” spoken by another person will trigger the iPhone, but also reduces the rate at which other, similar-sounding phrases trigger Siri.
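Apple’s actual system obviously involves trained DNNs and carefully tuned thresholds, but the final decision reads like a nearest-neighbor check in the speaker space. Here’s a toy Python sketch of that comparison step only; the embedding dimension, threshold value, and random vectors are placeholders of my own, not anything from Apple:

```python
# Toy illustration of the speaker-verification step described in the quote above.
# In the real system the speaker-space vectors come from a trained DNN; here they are just arrays.
import numpy as np

SPEAKER_THRESHOLD = 1.0  # placeholder value, not Apple's actual threshold

def is_enrolled_speaker(utterance_vec, enrollment_vecs):
    """Compare a new utterance's speaker-space vector against the saved enrollment examples."""
    distances = [np.linalg.norm(utterance_vec - ref) for ref in enrollment_vecs]
    # One plausible rule: accept only if the closest enrollment example is within the threshold.
    return min(distances) < SPEAKER_THRESHOLD

# Example: five enrollment vectors saved on-device, plus one new trigger candidate.
rng = np.random.default_rng(0)
enrolled = [rng.normal(size=128) * 0.1 for _ in range(5)]
candidate = enrolled[0] + rng.normal(size=128) * 0.05  # a voice close to the enrolled speaker
print(is_enrolled_speaker(candidate, enrolled))  # True, since it lands near an enrollment example
```

The enrollment step in this sketch is just saving five vectors; the interesting work is in the network that produces them, which the journal piece describes in more detail.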

I found this whole thing very interesting, even though I’m not experienced in the ways of machine learning, and particularly so because of something that happened last week: my wife and I were sitting on the couch and I used “Hey Siri” for something. Out of curiosity, I checked to see whether it triggered hers, and indeed it did not. With my iPhone, iPad, and Apple Watch at the ready, I had her try to trigger my devices multiple times, with no success.

It’s neat to see what goes into helping Siri reduce the chances of these false activations. Granted, my wife is a woman with a very slight Mexican accent, and I imagine the chance of a false activation would be higher with another male speaker, but the fact that the system can store and use the enrollment examples to cut down on this is still really cool.

Link: ‘Your’ vs. ‘My’ (Daring Fireball)

I have to agree with John Gruber’s assessment of Tom Warren’s review on The Verge regarding the new iPad Pro and iOS 11 beta:

Tom Warren’s review for The Verge of the new iPad Pro and iOS 11 beta is headlined “iOS 11 on an iPad Pro Still Won’t Replace Your Laptop”. Exactly in line with my piece yesterday, that “your” should be a “my”.

As well as this:

Again, Apple is not trying to convince everyone to replace a traditional Mac or PC with an iPad. Apple executives say that the Mac has a bright and long future because they really do think the Mac has a bright and long future. Any review of the iPad and iOS 11 from the perspective of whether it can replace a MacBook for everyone is going to completely miss what is better about the iPad and why.

I think he is right on point with that. Apple isn’t trying to replace the Mac entirely; they still have plans for it, and the Mac still has a future. They made quite a show of that at this year’s WWDC keynote, with new hardware and new technologies for AR and VR.

Sure, the iPad isn’t a Mac, and the Mac isn’t an iPad. Where I see strengths in using one device for a particular purpose, you might see weaknesses in using that same device for your own needs, and vice versa. I don’t think there is any valid debate about whether the iPad Pro (and perhaps even an iPad Air) can replace a laptop for many people; it’s not a fringe case. You can argue about whether it can replace a laptop for everyone, but then you’d be missing the point entirely.

When I handled customer support for Workflow, I saw plenty of people doing plenty of work with iPads alone – and I do mean real work. MacStories is a perfect example of work being done iPad-first (or mostly iPad-only). Hell, most of my work in the last two years was done from an iPad, and an Air 2 rather than a Pro at that. iOS 11’s new iPad productivity features certainly aren’t a setback to this. I don’t think there is any reason to suggest Apple is trying to force or convince everyone to go iPad-only and toss out their Macs, and I don’t see them shelving the Mac anytime soon. What I see with iOS 11 is Apple trying to make the iPad a better, more productive work environment for those who want to use it as such.

I have a 2014 MacBook Air, and while I’d love to replace it with a newer MacBook Pro, I do love my Mac and don’t plan on giving it up. I have a lot of great software there, and I enjoy the look and feel of the system, the hardware, all of it. I feel the same way about iOS and the iPad (and the iPhone), especially with iOS 11. Going forward (and getting my first iPad Pro tomorrow), I have a feeling I will be doing most things from my iPad. I have my reasons, among them portability, speed, and the enjoyment I get from using a multi-touch-sensitive piece of glass capable of running great software, side by side, without a clutter of windows and with fewer distractions.

There are other reasons, but ultimately I enjoy using it, and for what I need to do, the iPad is sufficient. There are plenty of people for whom the iPad Pro is sufficient for the work they need to do but who feel more comfortable with a traditional keyboard-and-mouse interface. I get the feeling they’ll be able to keep working that way for a long time, but for me, the iPad is a better way to work than a laptop.

Source: Daring Fireball

Link: How to make $80,000 per month on the Apple App Store

Johnny Lin tells of a scam in the form of a #10 Top Grossing productivity app on the Apple App Store, riddled with horrific grammar and spelling. The app claims it can scan your entire device for viruses and malware — something that sandboxing on iOS does not allow — behind an easy-to-miss $99.99 in-app purchase (and that’s just for a 7-day subscription). Lin explains it better:

Touch ID? Okay! Wait… let’s read the fine print:

“Full Virus, Malware scanner”: What? I’m pretty sure it’s impossible for any app to scan my iPhone for viruses or malware, since third party apps are sandboxed to their own data, but let’s keep reading…

“You will pay $99.99 for a 7-day subscription”

Uhh… come again?

It’s crazy that an app like this can get through review, and crazier to me that enough people could fall for it, but alas, they did. Working in customer support for an iOS app, it became clear to me that a lot of iOS users aren’t aware of sandboxing, or don’t understand how it works. Being educated on that point would likely curb the number of victims of such a scam. Obviously there are bigger issues here, but it might help to educate less tech-literate friends or family.
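For a sense of scale (this is my own back-of-the-envelope math, not a figure from Lin’s piece): at $99.99 per 7-day subscription, one subscriber is worth roughly $430 a month, so it would take fewer than 200 active subscribers to gross about $80,000 a month, before Apple’s cut.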

Source: https://medium.com/@johnnylin/how-to-make-80-000-per-month-on-the-apple-app-store-bdb943862e88

Optimistic iOS User

Some thoughts on why I am optimistic about the future of productivity, automation, and professional use of iOS — specifically the iPad.

Earlier this week, in the keynote for their annual Worldwide Developers Conference, Apple announced the next version of iOS: iOS 11. The announcement itself was expected, of course, but there were some changes I wasn’t expecting quite yet and was happy to see, particularly in regards to the iPad which, as they joked on stage, is being turned up to 11.

We all have our own critiques, and I’m no different, but I’m generally an optimist about these things. iOS certainly has its flaws, as any software platform will, but Monday’s announcements do a lot to move the iPad forward as a legitimate computer, something some of us already considered it to be, despite the room for improvement. I expected Apple to eventually implement better features for multitasking (and perhaps even drag and drop), although I was expecting to have to wait another year or so — what a pleasant surprise. I haven’t gotten my hands on the beta yet, but I plan to try the public beta later this month (after a proper backup, of course!). That said, I remain optimistic about the productivity improvements coming to the iPad.

The revamped Control Center and app switcher (now resembling something more like Mission Control on macOS), the improved Dock, the implementation of drag and drop, screen recording, and instant markup for screenshots — to name just a few — will undoubtedly enable many iPad power users to get more done more efficiently than before and, hopefully, draw in new iPad users and increased use from existing-but-less-frequent ones, pushing the iPad’s place further forward in the wide world of computing devices.

I know a lot of people were concerned by Apple’s acquisition of Workflow. Some who rely on the app for work and other reasons were worried that this might be the end of Workflow, and perhaps lead to less powerful automation on iOS. One of the reasons I can remain hopeful about the future of iOS — and specifically the iPad — in terms of automation, productivity, and professional work is that, after being a part of the Workflow team for a little more than two years, I can say that they loved working on Workflow as much as, or more than, I did: developing it, improving it, introducing new features to our customers. They were passionate about it. While I don’t have any inside information, I personally don’t think they would have made such a deal if they didn’t think they could continue to do these things in some manner at Apple. I should reiterate that I’m not basing this on anything other than what I saw working with them for a couple of years, so I could certainly be wrong, but, as I said before, I am optimistic.