Integrating Luna Display Into the Classroom

My wife gave me a Luna Display for my birthday and I have been really impressed with it so far. Luna is a USB-C dongle that plugs into a Mac. Using the companion app on iPad, you can access the entirety of macOS, wirelessly. 

I cannot wait to use this device this coming school year. I teach in about five different classrooms which makes transporting my Mac cumbersome. It is heavy, always running out of battery, and is missing some of my iOS music apps like forScore and Tonal Energy. The light form factor of my iPad Pro is perfect for toting around the building in one hand.

There are a few problems with this. I am still faster on macOS, for one. And more importantly, there is software on the Mac that I cannot take full advantage of on the iPad. I depend heavily on FileMaker for tracking student data, assessments, and assignments. I have discussed that workflow on two episodes of The Class Nerd podcast: one on tracking student data and another that is a bit more miscellaneous. For whatever reason, keyboard input is slow in certain parts of the user interface in FileMaker Go, the iOS version of the app. Even though I can do 90 percent of what I need to with my database on iOS, the typing speed slows me down.

Luna Display puts macOS right on the screen of my iPad.

It is for reasons like this that I am thrilled to use Luna Display in school next year. The experience of using the app is so smooth that at times I forget I am not using my Mac. Are there issues? Tons. But every now and then I stop to wonder why it is, again, that macOS can’t work with touch.

I also use a score editing app called Dorico on my Mac. Not as often as FileMaker, but often enough that it is sorely missed on iOS. I have not yet tried running it over Luna Display, but I suspect it won’t be so bad in full screen mode. Once I give it a shot I will report back.

Making Just Intonation Play Along Tracks for Your Performing Ensemble (Using Tonal Energy and GarageBand)

There are a few things that would be helpful to know about my music teaching philosophy before reading this post.

1. I believe that tone production, intonation, balance and blend are central to teaching performing musicians. I prioritize them much higher than fingering technique, rhythmic precision, and even reading comprehension.

2. The way I structure my band classes starts with, is focused on, and always revisits those core ideas.

3. I have accumulated a vast variety of tools and teaching strategies to meet my goals of having superior tone quality, intonation, balance and blend. One of the most essential tools I use is the Tonal Energy Tuning app.

Tonal Energy Tuner

What is Tonal Energy? A hyper-charged, power-user app for musicians with many advanced features, including...

- Tuning drones that can be triggered polyphonically

- Feedback as to how in tune a performer is, which includes a delightful happy face to depict good or questionable intonation

- Drones and feedback that can be adjusted to different temperaments

- A metronome (with more features than nearly any alternative on the App Store) that can be used separately or at the same time as the tuning drones

- Analysis tools that depict amplitude and intonation on an easy-to-read visual graph

- Recording and playback tools that let musicians listen back to their performance

- Automated metronome presets that can be sequenced

See the video below. I will first show the tuner playing a Bb drone, then show how it can model a Bb major triad all at once. Then I will switch the tuner to just intonation mode, and you will hear the third and fifth of the chord appropriately adjusted so that they are in tune with the Bb root. Next, the video demonstrates how the metronome can be used in combination with these drones.

Imagine now that a student is playing a scale along with Tonal Energy. By leaving the tuner in just intonation, and centering around the key area of Bb major, every note of the scale that I touch will resonate accurately with the Bb, giving the student an accurate reference to blend into.
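That third-and-fifth adjustment is not arbitrary: just intonation tunes intervals to simple whole-number frequency ratios, rather than the equal-tempered semitones most electronic instruments default to. A quick sketch (plain Python, nothing Tonal-Energy-specific) shows how far the justly tuned major third and perfect fifth sit from their equal-tempered counterparts:

```python
import math

# Just intonation builds intervals from simple frequency ratios.
# Equal temperament divides the octave into 12 steps of exactly 100 cents.
def ratio_to_cents(numerator, denominator):
    """Convert a frequency ratio to cents (1200 cents = one octave)."""
    return 1200 * math.log2(numerator / denominator)

major_third = ratio_to_cents(5, 4)    # just major third, ~386.3 cents
perfect_fifth = ratio_to_cents(3, 2)  # just perfect fifth, ~702.0 cents

# Deviation from the equal-tempered third (400 cents) and fifth (700 cents):
print(round(major_third - 400, 1))    # → -13.7 (the third is lowered ~14 cents)
print(round(perfect_fifth - 700, 1))  # → 2.0 (the fifth is raised ~2 cents)
```

So when the tuner snaps a Bb major triad into just intonation, the D is pulled roughly 14 cents flat and the F about 2 cents sharp relative to equal temperament, which is why the chord suddenly stops beating.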

Developing An Inner Ear for Diatonic Intervals

Much of music is made up of scales. For a student to learn how to most accurately tune different intervals and chords, I have the drone running in the background during most of my teaching in whatever key area we are working in. I then move my finger to the correct notes of the melody to model and reinforce what good intonation would sound like. See below for an excerpt of a song my beginning students might play.

In the video below, watch as I play this song by dragging my finger along to the melody, with a metronome to reinforce the beat. I like that TE has the option to speak counts out loud. In my experience, this really reinforces strong beats, weak beats, and where in the measure the performer is. Other tuning apps offer a counting feature as well, but the counts in TE sound more natural and less computerized.

Making Play Along Tracks in GarageBand

As you can imagine, I am doing a lot of dragging my finger along while students play for me. This gets tedious. I also want my students to be able to hear these pitch relationships when they practice, so I have begun recording them into play along tracks. How do I do this?

Inter-App Audio Apps and Audio Extensions in GarageBand

In the iOS GarageBand app, audio usually gets into a project either through software instruments or by recording directly with the microphone. But what you might not know is that you can also create a track based on the audio output of a third-party audio app. If you have ever used a DAW, think of Inter-App Audio Apps and Audio Extensions like plugins. Once launched, you are kicked into a third-party interface (much like using a reverb plugin from Waves or a synthesizer from Native Instruments) which then adds to or alters the sound of your overall project. In a more recent GarageBand update, Apple categorizes Inter-App Audio and Audio Extensions under the External option when you create a new track.

Audio Extensions are effects that alter your tracks, like reverbs and EQs, while Inter-App Audio captures the audio of a third-party app and records it into its own track in GarageBand. You can browse the App Store for Audio Extensions that work with GarageBand.

Recording an Inter-App Audio App Directly Into A GarageBand Project

Watch in the video below as I set up an Inter-App Audio track with Tonal Energy. Next, I press record and capture my justly tuned play-along of Lightly Row into my GarageBand project, using the euphonium sound. The euphonium drone is one of the roundest, darkest, fullest sounds, while also covering a great range, so it is effective for most instruments to play along with while modeling a rich, full, resonant tone.

Accurate Note Input with MIDI Controllers

In this video, you can really hear how sloppy the transition from one pitch to the next is when I drag my finger. Notice also that I did not play repeated notes. It is difficult to play the same pitch twice in a row without Tonal Energy re-centering itself on that key area. One way around these challenges is to pair a portable MIDI keyboard with Tonal Energy. The one I have settled on is the CME X-Key with Bluetooth.

It has a sleek look, is very small, and has low key travel. It has buttons for pitch shifting and octave jumping. And Tonal Energy adapts to it in just intonation mode! Watch in the video below. As I change which chord I am playing, TE automatically snaps the third and fifth of each triad in tune, relative to the root. For my Lightly Row performance, I can now hold a Bb drone in one hand while playing the melody with the other.

Embellishing The Track with Other Instruments

The resulting play along track is alone pretty useful for students. Let’s make it more fun by adding a drum track.

We can make it even more fun by embellishing with bass and other instruments. I like to change up the style of these play alongs. Sometimes I don't even pre-record them; I just improvise along with my students to keep things fresh. Be careful though: these software instruments are NOT justly in tune, so too many of them can defeat the purpose. I try to combat this by making the drone the loudest thing in the mix. Notice in this recording that I have tried not to create any motion in the accompaniment that interferes with the consonant intervals in the melody, so that the listener's ears can stay focused on the drone as their reference.

Conclusion

Well, that's it! I can trigger these in rehearsals and sectionals, and even share them with my students for home practice. Regular practice with tuning drones has really turned around my band's sound, and it gives students the foundation for long-term ear skills that will help them HEAR what is in tune, not just respond to the commands “you're sharp!” and “you’re flat!”

Spending Time with iPadOS 13

I have been running the beta of iPadOS 13 for almost a month now. iPadOS 13 ships this fall and is the first version of iOS that Apple is branding iPadOS because of its focus on features unique to the iPad. At first you might think this to mean that Apple is adding ‘desktop’ features to the iPad, but the more I think about it, the more I realize that the iPad is in many respects growing into a platform with its own unique set of strengths. Here are my favorite features so far.

New Home Screen!

The first thing I really love is the new home screen. You can fit way more apps on it now, and they stay oriented the same way in both landscape and portrait because it is a 6x5 grid in either orientation. This wastes way less space on the screen and allows you to cram a lot more apps into a smaller space for extra productivity!

Also useful is that you can pin your widgets to the left of your apps. I can now see my OmniFocus tasks, upcoming calendar events, recently accessed Files, and notes, every time I return to the home screen. For OmniFocus, I have it showing my Priority perspective, which shows all due items, soon to be due items, and flagged items that are tagged ‘Today.’ This is one more tool to help make sure I don’t let stuff slip through the cracks. The same could be said of the Calendar widget. Having the Files app display recently opened files on the home screen sure does feel a lot like being able to treat the home screen the same way I do my Desktop on the Mac.

Desktop Safari

The thing that is surprising me the most is how much the new Safari update transforms the way I use my iPad. Safari now runs like the desktop version. This means that websites operate as you would expect them to on the Mac. No more taking out your MacBook for those few websites that just never quite worked right on iOS. For me this is going to change the way I use a lot of my school district’s mandated learning management software, which would often not work correctly, or as reliably, on my iPad.

But what is really great is that I can now access the full versions of Google Docs and Squarespace from my iPad. Google’s apps on the App Store are still a little nicer, but they have never had the full feature set of the web apps, and now this is nearly a non-issue. Apple and Google need to figure out a better way to let users choose whether a document opens in Safari or Google Docs/Sheets/Slides, but I expect that to be ironed out eventually.

Even more exciting is that I can finally use the full toolset of Squarespace to update my website on the iPad (one of the few things that still made me take my Mac out of my bag). So far, Apple has done a nice job with these features, and they are not even ready for public release yet. There are some issues and unexpected behaviors, but not nearly as many as I expected. Desktop Safari has turned out to be the biggest productivity boost of all the new features. And did I mention there is now a download manager!?

Multitasking and PencilKit

There are also some improvements to multitasking. Notice above that I have two apps open side by side, with another floating in what Apple calls Slide Over view. iPadOS 13 adds the ability to manage multiple apps in Slide Over at once. The implementation is great: it works like multitasking on an iPhone X or later. You can swipe the little handle at the bottom of the app left and right to page through recent apps, and you can swipe up and to the right to see all recently opened Slide Over apps. This makes it much easier to manage the few apps I use most often in this mode: Tonal Energy Tuner, Messages, and Twitter.

I now also appreciate that you can have more than one instance of the same app open at the same time. Notice above that I am viewing two notes side by side. When I mentioned that iPadOS is growing into its own specific identity, the pencil tools on the right side of the screen are what I was thinking about. They have been brilliantly updated. And Apple is releasing them for use by third party developers in an API called PencilKit. Here’s to hoping that it is widely implemented so that using the Apple Pencil feels more consistent across apps.

See below also. Swiping from the lower left of the screen with the Apple Pencil lets you quickly mark up whatever you are looking at. And if you are in Safari, you can now clip an entire webpage, not just what fits in the screenshot. You can highlight and annotate right from this screen and then send it somewhere like Apple Notes, where you can search the article by text.

For me it is becoming clear that PencilKit is a feature that is going to widely shape and define the iPad as a particular tool for certain jobs that a Mac or an iPhone is not as useful for. Apple is bridging the gap a little by introducing a feature for the Mac called Sidecar, where you will be able to send windows of Mac apps to the iPad to be able to take advantage of the same pencil precision editing tools.

Conclusion

Overall, iPadOS 13 is shaping up to be an awesome release. I didn’t even mention half the features here. And even some of the ones I am most excited about will not reach their full potential until third-party apps take advantage of them (like PencilKit) or until more people are on iOS 13 (like iCloud shared folders). If you are an iPad user, you have a lot to look forward to this fall. If you want to try the beta, you can go here. It is pretty risky though, and I am admittedly very unwise for doing it.

A Blogging Experiment

Yesterday I posted The 7 Best Apple Home Devices on this blog. In part this was an effort to condense some of my intense study on the subject of home automation over the past four or five years so that someone could benefit from a broad-stroke overview of how I set everything up.

But this post was also 50 percent an experiment. Two summers ago, I posted The 6 Best Automation Apps for iOS. Strangely, it has become the most popular post on my entire website, by far. This is despite it not really being about music or education, and despite the fact that blogs like MacStories regularly pump out articles 100 times better on the subject.

My second most popular post is a video about indexing large PDFs using the musical score app forScore on iPad. It is far less popular than the post on automation, but still far more popular than anything else I have ever posted. I feel like it represents my niche pretty accurately.

I did some thinking on what could have made my automation blog post so popular. Was it that the title is concise? Bold? Simple? Was it that it had a bite-sized, concrete, number of apps in that same title? Or was it that I successfully tagged the post so that it shows up in a lot of web searches? I tried to replicate a little bit of that format in yesterday’s home automation post, while still writing about something I am passionate about. We will see how well it does.

And please do tell me if the home automation post was helpful to you in any way.

The 7 Best Apple HomeKit Devices

Learn about my smart speaker setup on this episode of my podcast:

Subscribe to the Podcast in... Apple Podcasts | Overcast | Castro | Spotify | RSS

I keep promising myself that a larger dive into my home automation workflow is coming to this blog. And it is. But I thought that I would first take a moment to outline the top seven apps and devices that I am using in combination with the Apple Home app. These get special attention given that their HomeKit integration allows me to conveniently manipulate them all from within the Apple Home app and command them with Siri. 

All of the devices in this post are also compatible with the Amazon Echo. I only buy home devices that work with both ecosystems because I use Alexa in my house as well. Furthermore, the home automation space is still very young and fragmented. The more open a platform is, the more flexible it will be, now and in the future.

Philips Hue Lights

Be careful. These Wi-Fi connected light bulbs are the gateway drug of home automation. With them, I can turn on every light in my house with my phone or voice. For my small house the bulbs work just fine, but I would recommend the light switches for larger homes, for convenience and to save money. Check out the image below to see how these lights appear in the Home app. I can control them individually or group them together, and I can automate them by time or location in the Apple Home app. It's really nice to have the lights gently dim around bedtime, and gradually wake me up with a gentle red hue an hour before work in the morning. Because my iCloud account also knows who my wife is and where she is, I can set up an automation that turns off all the lights once both of us have left the house, and another that turns them back on when one of us returns.

Check out Philips Hue lights here

The Home app aggregates all of my various different home automation devices.

The Good Morning scene is automatically set to run at 6:30 am on weekdays and at 9:30 am on weekends.

This is the set up screen for my Good Morning scene.

Ecobee Thermostat 

The Nest thermostat was the first home automation device I ever bought. It doesn't work with Apple HomeKit, though. So when it unexpectedly died last year, I jumped at the opportunity to try something new. Ecobee thermostats are the best around. Speaking into thin air "Hey Siri, I'm cold" to turn up the heat is a modern-day dream. Of course, I can automate temperature in all of the same ways I can lights. And I can even group these devices into "scenes" in the Apple Home app to streamline frequent actions. For example, the "Arriving Home" scene turns on the air and the lights. This scene is not only triggered by button or voice, but also runs automatically when my phone comes within close proximity to my house.

This is what you see when you open the ecobee app.

Once you tap on a thermostat, you get more detailed controls.

This is my Arrive Home scene. The door unlocks for me, the thermostat sets a comfortable temperature, and the lights on the main level of the house turn on.

Schlage Sense Door Lock

My Schlage Sense allows me to unlock my door with the tap of a button. My teaching studio is in the basement of my house and the door is upstairs. It is disruptive to a lesson to constantly be answering my door, so now I just tell my Apple Watch "Hey Siri, unlock the door." It authenticates through contact with my wrist and completes the task. Of course my Arriving Home and Leave Home scenes also unlock and lock the door, in addition to all of the other actions I mentioned above. Having my front door unlock for me when I arrive home makes me feel like I am living in the future. Having it automatically lock when I leave gives me peace of mind that my house is safe. 

Logitech Circle Camera

Of all the HomeKit devices out there, cameras are the hardest to shop for. I have found the Logitech Circle to be the best out there. Nest makes some great cameras, but their lack of HomeKit support has driven me away. I have the Logitech set up in our dining room, facing down the primary hallway in my home. It is plugged into an iHome smart plug, which is also HomeKit enabled, so that I can turn it off and on remotely. This plug is automated in the Home app to turn on when neither my wife nor I am home and turn off when one of us arrives, so it works like a security camera. When it detects motion, it turns on our dining room and kitchen lights. It has a two-way microphone so you can chat with someone in your home if you need to. And what I love most is that the camera feed shows up right in line with my other smart home controls in the Apple Home app.

The interface for the Logi Circle 2 app.

Eve Sensors

Sensors need no introduction. These things can trigger any other home device when they detect motion. Most of mine are set to turn on the lights in a given room when I walk in. But they can also trigger thermostats and smart plugs. My favorite sensors on the market are made by Eve. They are easy to set up and work reliably. Eve also makes a number of other interesting HomeKit products.

Sensors appear as ‘Triggered’ in the Apple Home app when they have detected motion.

In Apple Home, I can make an automation that turns on the upstairs light whenever my eve sensor is triggered upstairs.

The eve app makes a great alternative to the Apple Home app for controlling all your devices.

iHome Smart Plugs

I like using smart plugs as an all-purpose way of turning on and off the things in my house that are otherwise not “smart.” In addition to the camera workflow I mentioned above, I also have these plugged into other devices throughout the house. For example, my bedroom fan is plugged into one, so I can now turn it on and off in the middle of the night without getting up: “Hey Siri, turn on the fan.” A lot of brands make smart plugs, but iHome's are the easiest to set up and use in my experience.

Apple HomePod

I was hesitant about the HomePod at first, given that it shipped with incomplete software and relies entirely on Siri for voice commands. Still, the device offered some compelling features. When iOS 11.4 brought the features that were missing at release (AirPlay 2 and multi-room audio), I scooped one up, refurbished, while Best Buy was running a $100-off deal.

The HomePod fulfills a lot of the same purposes as the Amazon Echo. It is distinguished by linking into the Apple ecosystem, allowing me to command Apple Music, Apple Podcasts, and all of the home automation devices mentioned above. 

Control of the HomePod exists inside the Apple Home app where it appears as a speaker device. The recent addition of AirPlay 2 allows my two Sonos One speakers to show up in the Apple Home app as well. 

The HomePod is first and foremost a good speaker. But it can also command your other speakers in the house, and even the audio output of your Apple TV. Simply say “Hey Siri, move this music to the living room,” and listen as your music is magically transported from one speaker to the next. You can route your Apple TV audio through this handy speaker and speak playback commands for your TV shows and movies, like “pause,” “stop,” and “skip ahead 50 seconds.”

The HomePod is the core of the Apple Home experience. Of course, you could just as easily control every device in this post from an Echo. However, as an Apple Music subscriber, and frequent listener to podcasts in the kitchen, having a HomePod makes sense for me to own.

It looks like the investment is going to pay off. This fall, iOS 13 will be adding even more features to the HomePod and Home app. For example, the HomePod will be able to distinguish between my voice and my wife’s. This way, when she asks it what is going on today, it will read from her calendar instead of mine. iOS 13 is also introducing speaker automations for scenes. So my Good Morning scene in the Home app will now play my favorite breakfast playlist in addition to turning on select lights and changing the temperature.

And finally, HomeKit automations and Siri Shortcut automations are going to be better tied together, and will be able to be triggered automatically. For example, doing something like stopping my wake-up alarm will both run the Good Morning scene and automatically run this Siri Shortcut that tells me how I slept, delivers a weather report, and opens a meditation in the Headspace app.

In iOS 13, HomePod play controls show up right in the Home app.

In iOS 13, music playback can become part of your scenes.

The new Siri Shortcuts app on iOS 13 integrates home automations and personal automations. It also allows them to be automatically triggered by time, location, opening a particular app, and more! In this example, stopping my wake-up alarm triggers my I’m Awake Siri Shortcut, which sets the Good Morning scene, reads me the weather, tells me how I slept, and starts a meditation.

The Break Up of iTunes, a musician’s perspective

Piggybacking off of yesterday’s post, there was a rumor earlier in the month that the next wave of iOS apps to come to the Mac are Apple’s very own media apps.

9to5mac.com - Next major macOS version will include standalone Music, Podcasts, and TV apps, Books app gets major redesign | Guilherme Rambo:

Fellow developer Steve Troughton-Smith recently expressed confidence about some evidence found indicating that Apple is working on new Music, Podcasts, and perhaps Books apps for macOS, to join the new TV app.

I’ve been able to independently confirm that this is true. On top of that, I’ve been able to confirm with sources familiar with the development of the next major version of macOS – likely 10.15 – that the system will include standalone Music, Podcasts, and TV apps, but it will also include a major redesign of the Books app. We also got an exclusive look at the icons for the new Podcasts and TV apps on macOS.

I have been arguing that iTunes should be broken up into separate apps on the Mac for years. As a musician and teacher who is an absolute iTunes power user, and who depends on music library management tools, I thought it was worth digging into the implications of this a little bit.

If you are a podcast listener, and have room in your diet for some shows that discuss Apple technology, there was an astoundingly good conversation about this topic on last week’s episodes of Upgrade and ATP. Both shows discuss not only the implications for the future of iTunes, but for the very nature of the Mac itself.

I am so excited for the TV app and the Podcasts app to get their own attention. They have needed it for a long time. I imagine Podcasts will be solid out of the gate. I will kind of miss the current TV app icon on iOS and the Apple TV, but I understand that Apple needs to brand it with their logo since it is coming to third-party TVs and Amazon Fire products this fall with the launch of their new TV service. I don’t know how a Books app based on the iOS version would work with my imported PDF book library, but it is already wildly inconsistent between iOS and macOS, so I cannot imagine it could get any worse.

iTunes is a place that I have traditionally relied heavily upon to organize my music library, recordings of my ensemble, and video performances of my concerts. I detail my entire music and video workflows in my book, Digital Organization Tips for Music Teachers.

iTunes is the only app that allows me to store my personal library alongside a streaming music library and sync it across multiple devices. This is what has set it apart from Spotify for me over the past few years. iTunes also has some great video organization tools. For years now, I have organized all video of my school ensemble’s live performances (amongst numerous other musical performances and home videos) in the video section of iTunes, and then pointed a Plex server at the folder of files so that I can stream them from my Apple TV and iOS devices on the go.

If the new Music and TV apps are just like their iOS counterparts, there are a whole lot of features I depend on that could potentially get ditched. Here are a few of them...

- Importing my own music. The iOS version of Music can’t even import a song. That’s right! If I buy an album on Bandcamp, or take an audio file of a professional band performing a tune my ensemble is working on, I can drag them right into iTunes on my Mac, and they will sync to my mobile devices over Apple Music Library. I would imagine Apple has to have figured this one out for iOS if they are going to ship this app on the Mac in the fall.

- Metadata control. It would be a sad day if I could not press the info button on a song and add my own comments, rating, and adjustments to the title, album name, etc.

- Smart Playlists. Jazz and classical recordings are notoriously difficult to manage in iTunes because of how complex their metadata is. In addition to editing artist and album information in these recordings, I have spent some time adding extra info to the comments section of my songs and then creating smart playlists to filter them. If Miles Davis is tagged in every recording he sat in on, you can make playlists like ‘Songs Miles Soloed On Between 1961-75.’

- Adding video. QuickTime (much like Preview) is an app that exists only on the Mac, because its functionality is built natively into iOS whenever you tap a media file (or a PDF, in the case of Preview). Apple has never had a dedicated app for managing video (although there is the awkward iMovie Library feature, which has an arbitrary file limit). That said, iTunes is a pretty great utility for this purpose. I would hate to lose its video management features, even though they were never on iOS to begin with. The TV app is looking more and more like it is built to fulfill Apple’s TV strategy, which is to aggregate as much TV and movie content from as many providers as possible into a unified entertainment service. Don’t get me wrong, I am excited; I just don’t see myself using it to organize recordings of my band concerts.
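The smart playlist idea in the list above is easy to picture in code. This is purely an illustrative sketch: iTunes does not expose its library as Python dictionaries, and the track data and function names here are made up, but the filtering logic is the same kind a smart playlist applies to a tag in the Comments field plus a year range:

```python
# Hypothetical track records modeling iTunes metadata fields.
tracks = [
    {"title": "So What", "year": 1959, "comments": "Miles Davis, modal"},
    {"title": "E.S.P.", "year": 1965, "comments": "Miles Davis, second quintet"},
    {"title": "Giant Steps", "year": 1960, "comments": "John Coltrane"},
]

def smart_playlist(tracks, tag, start_year, end_year):
    """Return tracks whose comments contain `tag` within the year range,
    the way a smart playlist combines rules with AND."""
    return [
        t for t in tracks
        if tag in t["comments"] and start_year <= t["year"] <= end_year
    ]

playlist = smart_playlist(tracks, "Miles Davis", 1961, 1975)
print([t["title"] for t in playlist])  # → ['E.S.P.']
```

The payoff of front-loading the metadata work is exactly this: once every recording carries consistent tags, playlists like ‘Songs Miles Soloed On Between 1961-75’ build themselves.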

Presumably, iTunes isn’t going anywhere any time soon. Given these legacy features, and the need to sync older iOS devices to a Mac, I imagine it will still ship on the Mac, buried in the Utilities folder, for years to come. Hopefully, users will still be able to do the actions I listed above in iTunes, and enjoy the benefit of them in the new Music app.

In conclusion, I remain highly cynical about this transition because Apple has not seemed interested in making good apps in recent years. Conversely, I am enthusiastic about the long-term benefit. If Apple developers are writing code for just one version of their apps instead of two, it is more likely that iOS versions of software will be elevated. That is exciting, even if it means the Mac apps cannot, at first, do everything they always could. Coupled with rumors that Apple will release an ARM-based Mac in the near future, I would like to believe that years down the road we will be getting closer to a shared app platform across all Apple devices, with feature parity and less distinction between input methods and the hardware the software is running on.

iOS apps I would love to see come to the Mac, a musician’s perspective

There has been a lot of buzz lately around ‘Marzipan,’ a set of developer tools that Apple is making to help third party developers more easily port their iOS apps to macOS. It is heavily rumored that Apple will detail this initiative at their developer conference, WWDC, this June (during the keynote on June 3rd).

Last year at WWDC, Apple unveiled four Mac apps that use this new set of tools to bring iPad-like versions of iOS apps to the Mac: Home, Stocks, Voice Memos, and News. The apps have been met with much criticism for their lack of Mac-likeness. For example, when you double-click a news article in the News app, the article does not open in a separate window, a behavior you would expect from the Mail or Notes app on the Mac. Likewise, when setting up a time-based automation, the Home app displays the iOS-style date picker, with scrollable numbers, rather than the calendar-like interface you would see when selecting a date in traditional Mac apps.

I agree that these four Mac apps are garbage, but I would much rather have their utility than not. Even if all these Mac apps do is act like iPad apps that accept input from a cursor instead of a finger, I would still kill to have any of the following on macOS:

-Tonal Energy Tuner. There are no tuning drone based apps, even on the web, that do 1/100th of what this iOS app does. My Mac is my primary device for sharing audio and visuals with my students during class. This would get used every single day.

-forScore. I have a weird way of managing my digital sheet music using the file system of my Mac, but then importing duplicate copies into my iPad’s forScore library. It would be really nice to have one place where this is all managed across all devices. Of course, this would require forScore to sync a library across devices, which the team has told me is too difficult a task to prioritize currently.

-Twitter. Twitter killed their Mac app recently, and as someone who has started using their app on iOS (Tweetbot is still far better, but Twitter no longer provides the APIs it needs to stay up to date with modern features), I would really prefer not to use the web browser on the Mac.

-Apollo. To my knowledge, there has never been a good Reddit client on any non-mobile device. Apollo is great.

-Facebook Messenger. I hate Facebook but it is a necessary communication tool. I would love to use it for that without going to their stupid website ever again.

-Overcast. My favorite podcast player. Would love to have it on Mac.

-Health. An app that excels in showing me data on graphs and charts sure would be useful on the big screen of a Mac.

-Due. My favorite reminder app is already on Mac but it looks gross.

-Instapaper. I use ReadKit as an Instapaper client on the Mac now, but would not mind something more minimal. Instapaper’s simplicity makes it the perfect candidate for a Marzipan app.

-Instagram. Who wouldn’t want this on Mac?

-Tempo. There is only one good metronome app on the Mac (Dr. Betotte). Opening up UIKit to Mac developers would bring a whole lot of competition in this space. Frozen Ape’s Tempo would be my first choice to get ported over.

-AnyList. Their Mac app is already just a gross port of their iOS app. Using Apple’s tools would surely make it prettier and more responsive.

-Ferrite Recording Studio. My podcast audio editor of choice is only on iPad. It sure would be cool to use these tools on a bigger screen with keyboard and mouse.

Email

I was ‘triggered,’ so to speak, by this New York Times op-ed over the weekend: No, You Can’t Ignore Email. It’s Rude.

After reading it, I was admittedly less put off by the content than I was by the headline. It’s a short one, so I won’t even quote any of it here. Just read it.

I have had a particularly rough year with email, mostly because I have had a rough year with time management. Simply put, I bit off more than I could chew this school year. I have had more emails collecting dust at the bottom of my inbox for weeks than ever before, and this poor practice has even started to bleed into my text message conversations, which I often claim are the far easier way to get a response from me. They still are, but my failure to respond is obviously due to time management, not email.

Or is it? Email, by nature, is still part of the problem. Email is so flexible a tool, and used for such widely different purposes, that it is hard to prioritize its content. And everyone has such different email practices, expectations, and writing styles that it is impossible to know how to please everyone. I prefer the efficiency of digital text over the phone when possible. But my tone comes across indisputably better in person than in email, where I am short and to the point when crunched for time. If you know me personally and I am responding briefly, it would be easy to conclude from my email messages that I am mad at you.

Tools like SaneBox and email apps like Spark help. Snoozing messages, deferring to-do emails to OmniFocus as tasks, and filtering my inbox have cut my email time down by an average of six hours a week. Replacing email with Slack and Trello on my music team has also helped tremendously. And using TextExpander to type default messages to parents cuts down on hours, too. If you are interested in these strategies, I welcome you to check out my podcast, The Class Nerd: episodes 1 and 2 on email, episode 5 on team communication, and a forthcoming episode on parent communication tools.

So why am I still stressed? I find that 90 percent of the time it is due to getting ‘stuck’ on certain messages: messages that require a careful answer, a precisely crafted tone, or knowledge of the conventions by which someone wants to be replied to. I never get stuck on emails that are a keystroke and a click away from being dealt with or deferred... but I guess that’s the point of the op-ed. You can’t defer ‘people.’ And the emails that require the most human touch are the ones I get hung up on.

But still, I find the mixed conventions of email utterly perplexing. Do you expect that I reply within a day? An hour? A week? Do you want me to address the message Dear ___? Would you like an introductory sentence? A closing thought? An email signature? Would you rather I tell you I got it, even if a proper response can’t be delivered for weeks? And how do I deal with email while still actually doing my job (which is music teaching, by the way, not sitting in front of a computer screen)? How many times a day should I check my email? Should I leave the notifications on? Should I even have the app open all the time in the first place? If I open it intentionally, how many times a day, and when? How do I respond like E.B. White when I perceive others to expect more in the modern age?

At the end of the day, I think this article is a little unfair. I do not think that everyone deserves my attention, and they definitely don’t get it when they want it. But there are also some clear examples in my life of when my slow email response is inconvenient and disrespectful to others. So, even if I can’t find a good answer to the questions in the previous paragraph, what do I do during weeks like these past few, when I am hopelessly behind? The article had a good idea: when you are behind, recommend that others find some other channel to reach you... a Slack channel, Twitter, post-it notes, etc. I love this idea, but Slack and Twitter don’t seem professional for school use.

So here is my proposal. A free app idea for anyone reading... I would like an app that...

1. Has a user interface like a chat app.  

2. Allows anyone to reach me. 

3. Can interface with my SMS but does not give others access to my cell phone number. 

4. Has ‘office hours,’ meaning that messages don’t go through to me during hours I set. 


Something like the Remind app, but working in the opposite direction. I could give someone a link in my email signature, they could message me through it informally, and they could expect, at the least, a quick “I got it.” Know of anything?
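
Requirement 4 is the most mechanical of the four, so here is a minimal sketch of how the office-hours gate could work. Everything here is invented for illustration (the function name, the window times, the queue); it is an idea sketch, not a real app’s API:

```python
from datetime import time

# Hypothetical office-hours gate: a message is delivered immediately only
# if it arrives inside the configured window; otherwise it is queued for
# the next time the window opens.
OFFICE_START = time(8, 0)   # 8:00 AM
OFFICE_END = time(16, 0)    # 4:00 PM

queued = []

def handle_message(message, now):
    """Deliver during office hours, queue otherwise."""
    if OFFICE_START <= now <= OFFICE_END:
        return f"delivered: {message}"
    queued.append(message)
    return "queued for office hours"

print(handle_message("Can we reschedule the lesson?", time(10, 30)))
# → delivered: Can we reschedule the lesson?
print(handle_message("Quick question about the concert", time(21, 15)))
# → queued for office hours
```

The sender would still get an immediate automatic “I got it,” which is exactly the low-effort acknowledgment the op-ed’s questions of etiquette are really asking for.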

Editing audio with touch gestures and Apple Pencil

I Tweeted a thread yesterday about the superiority of editing audio on a touch screen by manipulating regions directly with my fingers and an Apple Pencil.

It became clear that it wasn’t suitable to explain just how comfortable and precise the experience is in written text. So I have done my very best to communicate it through video. Enjoy…

Ferrite is still a very immature DAW by modern day professional standards. And the iPad is still a very immature platform for creative professional software by modern computing standards. What the iPad really needs is something like Logic Pro X. Apple has got to set the standard for professional apps by releasing their own.

Logic or not, Ferrite can get the job done. And gesture/stylus support aren’t the only things that make editing on an iPad more fluid. The form factor of a tablet makes it perfect for “couch-editing.” Plus, iPad is a lot easier to pick up and move from room to room while I am listening to hours of my own podcasts play back.

Favorites of 2018

Earlier this month I blogged a series of ‘Favorites of 2018’ posts where I reflected on the albums, live music, apps, and things that made my year a better one.

Now you can conveniently access all four of those posts from this one. Click below to read about my…

Favorites of 2018:

-Favorite Albums

-Favorite Live Music

-Favorite Apps

-Favorite Things