Using Workflow to get map images of locations

We just got back from vacation. By just I mean last weekend, but I may still be in denial that it’s over. I haven’t had much time to write since, though, so I’m working on that. In the meantime, I thought I’d share a simple little workflow I sometimes find useful for saving locations.

There are plenty of ways to share your location with others these days, be it through the recently added features in Google Maps, the built-in options in Apple Messages, or another service such as Glympse. There are also plenty of ways to save locations, using apps like Swarm (Foursquare) or, one of my favorites, Rego (more on that in the future, probably). Though not created specifically for saving locations, Day One is also great here, with its quick and easy Check-in feature and its ability to add a location to a journal entry’s metadata. Usually, if I’m saving a location, it’s by pulling the location data from a photo with Rego or by creating a Day One journal entry. Sometimes, however, I want more than just a location’s coordinates or a link to open it in Maps. Sometimes, I want a map image of the location.

While I could certainly take a screenshot in whatever map app I happen to be using, that requires cropping and potentially more editing. I want something I can access easily, tap, and get a map image of my current location, without having to tell it anything. Luckily, Workflow provides a very simple way to do this, thanks to its Content Graph engine. When you pass input to an action in Workflow, that action processes the input based on the type of content it expects or can receive. If you pass a photo into the Get Text from Input action, the output obviously won’t be the photo. Workflow knows you want text, so it gets the only text associated with the input: the file name of the photo.

You might see where I’m going with this: using this same concept, we can pass a location into the Get Images from Input action in a workflow. The only image associated with location data, at least as far as Workflow is concerned, is a map image of that location, so that is what it gives you. This means we can simply use the Get Current Location action, followed by Get Images from Input, to get a map image for the current device location. You can use Workflow’s magic variables system to easily add more detail if you would like to share or save the image along with a location name, coordinates, or perhaps a Maps or Google Maps link.
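Workflow handles the location-to-image conversion for you, but if you’re curious what the same idea looks like in native code, here is a rough sketch using MapKit’s MKMapSnapshotter. This is only an illustration under my own assumptions (the function name, region span, and image size are arbitrary), not anything Workflow itself exposes; in the workflow, the Get Current Location action supplies the coordinate that a CLLocationManager would supply here.

```swift
import MapKit
import UIKit

// Rough sketch: render a map image for a coordinate, similar in spirit to
// what Get Images from Input produces when handed a location.
// (Function name, region span, and image size are arbitrary choices.)
func mapImage(for coordinate: CLLocationCoordinate2D,
              completion: @escaping (UIImage?) -> Void) {
    let options = MKMapSnapshotter.Options()
    options.region = MKCoordinateRegion(center: coordinate,
                                        latitudinalMeters: 500,
                                        longitudinalMeters: 500)
    options.size = CGSize(width: 600, height: 400)

    MKMapSnapshotter(options: options).start { snapshot, _ in
        // snapshot?.image is a plain UIImage you could share or save.
        completion(snapshot?.image)
    }
}
```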

Here is a simple version of the workflow that gets the map image and then lets you share it. This is what the output looks like:

The workflow results in an image of a map of the current location.

If you have any questions about setting the workflow up for more specific scenarios or run into trouble with it, feel free to reach out to me.

The Sweet Setup on Saving Instapaper Highlights to Ulysses

I’ve been using Ulysses on iOS on and off over the past year, but I recently switched to it full time for my writing needs. It’s a powerful app with a lot to offer, including excellent support for automation and for publishing directly to WordPress, which I’ve found very useful over the last couple of months. Not to mention, it’s a beautiful app that makes your text look great. Instapaper, which I’ve been using for several years, is another beautiful app, one that makes other people’s text look great by presenting articles in a clean, simplified format and letting you read, save, share, listen to, speed-read, highlight, and annotate them.

Last week, The Sweet Setup wrote about using IFTTT and Dropbox to automatically save Instapaper highlights to Ulysses for research. I’m currently doing research for a job I want to apply for, and really hope to get, so I decided to give this a try. I set up my IFTTT applet similarly to how it is described in the link above, but I changed it to use the Append to Text File Dropbox action rather than the Create New Text File action. I also made another important change at the beginning: rather than using New Instapaper Highlight as the trigger in my applet, I used the New Comment option instead. I’ve saved some articles to Instapaper related to this job position, the newer technology it involves, the future of the industry, and so on. Instead of just highlighting important bits, I wanted to add a note to each bit I found important, explaining why I thought it was important or jotting down some quick thoughts on how it is relevant. In Instapaper, adding a comment to selected text automatically highlights it as well, and if you use the comment option as the trigger in IFTTT, both your note and the highlighted text are available. I set up my applet to format the highlighted bit as a markdown quote, with my note below it.

Configuration of an applet to save Instapaper comments to Ulysses (via Dropbox)
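For clarity, each appended entry ends up shaped roughly like the sketch below: the highlighted passage as a markdown blockquote, my note underneath, and the article’s title and link. The function and field names here are placeholders for illustration only; the real formatting is done in the IFTTT applet’s Dropbox action, not in code.

```swift
// Sketch of the layout each appended entry follows (placeholder names;
// the actual template lives in the IFTTT applet's Dropbox action).
func appendedEntry(highlight: String, note: String,
                   title: String, url: String) -> String {
    return """
    > \(highlight)

    \(note)

    [\(title)](\(url))

    """
}
```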

It’s been useful so far. If nothing else, just having my highlights and related notes in Ulysses is a good start. I do wish IFTTT would let me include the URL and title of the article only in the first entry for a given article, rather than with each highlight; however, I can work with that for now. I also set up an applet specifically for highlights (without comments), but I haven’t yet tested with both turned on to see if they play nicely together on the same article (i.e., if I make some comments in an article as well as some plain highlights without comments, will both be appended to the same file without duplicating the highlights when a comment is used, since comments also highlight the selected text?). If anybody has ideas about that or has tried it, I’d be interested to hear. For my current mission, at least, the comment version will work well enough. Thanks to The Sweet Setup for another good idea and, if you haven’t before, you should check out the site.

Puerto Rico

We are on vacation in Puerto Rico at the moment, a much-needed holiday. We are staying in more of a resort community, though we do plan to get out and see more of the island than this little bubble, including El Yunque rain forest, Laguna Grande (a bioluminescent bay), Old San Juan, and the fortress of San Felipe del Morro, along with various other random excursions throughout the week.

At the moment, though, on our first full day, we are enjoying sun, beach, and pool. I’m taking in the beautiful vista of palm trees from in front of our hotel room as we prepare to head to the beach again. This should be a fun week.

iOS 11 beta after one week

Some quick thoughts on the iOS 11 beta, one week in.

Since iOS 8’s extensions, it’s been hard to imagine using iOS pre-8.0. The same thing happened with multitasking on the iPad in iOS 9. At this point, I couldn’t imagine using an iPad with iOS 10 again. I’ve only been using the beta for about a week and it’s been a joy so far… on the iPad.

In contrast, I had to reset all settings on my iPhone, and it’s still a bit slow. Also, I lost a few years’ worth of health data, so that’s been fun. The crashes I’ve experienced on the iPad have been relatively minor and are likely to be resolved in the next iteration. To me, it’s worth it for the much smoother and faster operation and the significantly improved dock and multitasking system.

The app switcher, app spaces, the new dock, and being able to have a second app hover over the main one and then just swipe it off screen and out of the way are just some of the joys I’ve found so far in the new iOS 11 for iPad. And it really does feel like that: for iPad. I hope this is a sign of the future, marking a more prominent divergence in iOS between the two platforms. The iPad can be a lot better, and that’s the direction iOS 11 seems to be taking us. I’m excited to see other apps begin to adopt these new features as we approach September, especially drag-and-drop. It might take some getting used to, but I think anybody who does any work at all from an iPad will appreciate iOS 11.