For the last two years I've kept saying the same thing about the Keynote: this is for the consumers and press, not the developers who signed an NDA. Much of what was said was hardly earth-shattering or exciting for developers. There was far more hardware than I would have expected, much of it shipping now, plus a few sneak peeks at new gadgets coming down the pike. Of all of them, the HomePod, Apple's smart speaker entry, will garner the most press, since the press likes to see everything as a competition, particularly with Amazon and Google.
There were many tidbits that sounded good, such as Amazon Video on Apple TV and the adaptation of the iPad Playgrounds keyboard for all apps. Five things from the keynote struck me as the big takeaways, though they may not make the news as much. I also had some thoughts about the one last thing everyone will be talking about.
The Missing Apple TV
Under the phrase "coming later this year," all Apple TV news except Amazon Video was quietly swept under the rug. The last major Apple TV updates happened late August at their own event, which might be what is happening this time. Still, the lack of news is a bit disturbing when the only announcement was that an app that already exists on the iPad will be on Apple TV by winter. It raises my suspicion that there's major hardware and OS work going on. I'd still like to see tvOS assimilate into iOS and the Apple TV become a low-end desktop iPad.
iPad is looking more like a Mac
With the dock you now get by swiping up from the bottom of the screen, the ability to drag an app out of it to open a new window on top of another app, and drag and drop of data between apps, the iPad is looking more and more like a Mac. Then there's the newest feature: a true file system, where you can access files from iCloud and from several third-party cloud storage services such as Google Drive, Dropbox, and Box. All of this blurs the line between tablet and laptop even more.
Markup on the iPad
Paradoxically, at the same time the iPad is looking more like a laptop, it is also looking more like a pad of paper. A new feature is Markup, which takes the tools already in the Notes app in iOS 10 and gives them even more life. You can get to Notes directly from the lock screen by tapping it with an Apple Pencil. You can use the pen, pencil, marker, and eraser to draw or to write freehand text. In one of the more interesting machine learning aspects of iOS 11, the iPad can read your handwriting and even search for words you wrote by hand. Playing around with a beta version last night, the first words I scribbled on a Notes page were converted into a title for my note. Screenshots can be annotated by tapping a thumbnail of the sketch; other text and windows can be converted to PDF and annotated there. You can even scan documents into Notes, with distortion correction, and mark up the scan. This might seem like a small feature, but it brings some of the most powerful aspects of a piece of paper to the iPad. It gets ideas out of people and onto a medium that several people can share easily and instantly.
Virtual reality and augmented reality
If there is a theme at WWDC, it seems to be turning Apple devices across platforms into virtual reality and augmented reality devices. This got a lot of the non-hardware time, with multiple demos and explanations. Apple did not, however, introduce any VR hardware, and it left much of the heavy lifting to established third-party engines. In my mind, if you haven't started to learn Unity or Unreal, now is a good time to do so, since they, not an internal Apple product, seem to be the backbone of this effort.
The Metal theme: Learning, not intelligence.
One of the frameworks mentioned frequently, in and out of its usual context, is Metal. Metal is a framework originally meant to provide low-level access to the graphics processing unit (GPU), a specialized processor that renders 3D graphics quickly without burdening the CPU. The GPU is a massively parallel chip, and as such it can perform certain computations at speeds the CPU could never manage. Apple has moved from using Metal as just a 3D rendering system to using the GPU as a second processor for fast computation. Metal is found in the VR and AR work, and in updating many of macOS's older interfaces to speedier, cleaner versions.
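To give a sense of what "the GPU as a second processor" looks like in practice, here is a minimal sketch of a Metal compute dispatch. The kernel name `doubler`, the data, and the threadgroup sizes are my own illustration, not anything Apple showed; the API calls are standard Metal:

```swift
import Metal

// Metal Shading Language source for a trivial compute kernel:
// each GPU thread doubles one element of the array, all in parallel.
let source = """
#include <metal_stdlib>
using namespace metal;
kernel void doubler(device float *data [[buffer(0)]],
                    uint id [[thread_position_in_grid]]) {
    data[id] = data[id] * 2.0;
}
"""

let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!
let library = try! device.makeLibrary(source: source, options: nil)
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "doubler")!)

// 1024 floats for the GPU to process at once.
var input = [Float](repeating: 1.0, count: 1024)
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: [])!

let commands = queue.makeCommandBuffer()!
let encoder = commands.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
encoder.dispatchThreadgroups(
    MTLSize(width: input.count / 32, height: 1, depth: 1),
    threadsPerThreadgroup: MTLSize(width: 32, height: 1, depth: 1))
encoder.endEncoding()
commands.commit()
commands.waitUntilCompleted()
```

The same boilerplate scales from doubling floats to the matrix math underneath machine learning, which is why Metal keeps showing up outside of graphics.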
Yet Metal is the bedrock of the answer to one of the biggest recent criticisms of Apple: artificial intelligence. Apple has shifted the focus here a bit, giving developers the tools to bring intelligence to iOS and macOS by observing and learning, then making predictions from that learning. Apple is calling this machine learning, not artificial intelligence. There's a difference: artificial intelligence covers any case where something not human acts like a human. Machine learning is more specific: it is giving a machine data so it can come up with its own conclusions and predictions based on that data. Apple is banking on something subtler than talking to a tube sitting on your desk or counter (more on this later): making hundreds of interactions with your device a learning experience for the device. Ironically, while Apple will be blocking everyone else from getting a user's personal data in Safari with intelligent tracker blocking, its devices will be absorbing far more personal information to learn about their user's preferences and habits. Machine learning is readily available to all developers through Core ML and associated frameworks like Vision, which can do many forms of image detection in only a few lines of code. The idea is not just to have a tube that answers your questions, but for every app to have some idea of what you want before you want it.
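As a sketch of how little "a few lines of code" means, here is face detection with the Vision framework. It assumes you already have a `CGImage` in a variable called `cgImage`; that variable, like the print statement, is my own illustration:

```swift
import Vision

// Ask Vision for face rectangles; the completion handler
// receives the observations once the request has run.
let request = VNDetectFaceRectanglesRequest { request, error in
    guard let faces = request.results as? [VNFaceObservation] else { return }
    for face in faces {
        // boundingBox is in normalized coordinates (0...1).
        print("Found a face at \(face.boundingBox)")
    }
}

// Assumes `cgImage` is a CGImage you've already loaded.
let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
try? handler.perform([request])
```

Swapping in a Core ML model via `VNCoreMLRequest` follows the same request-and-handler pattern, which is what makes the framework so approachable.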
HomePod is a Smart Speaker, not a Smart Assistant
The one thing the press will criticize and get totally wrong is HomePod. Those who claim the talking tubes are Apple's competitors miss the point. Apple is not competing with them, and the WWDC keynote, like Core ML, made that abundantly clear for anyone who listened. Comparing the HomePod to Alexa is like comparing duck à l'orange to candy bars. That's not the competition Apple is targeting; Bang & Olufsen is. The HomePod is not an intelligent assistant, it's an intelligent wireless speaker. All the AI stuff was worth thirty seconds and one slide in a ten-minute presentation on a beautiful-sounding speaker. It's not important to Apple right now. No one should forget that Apple is a bunch of obsessive audiophiles. If they invented or popularized the smartphone with the iPhone, they are doing the same for the smart speaker with the HomePod. This is first and foremost a speaker. Yes, it is one you talk to, but the sound coming out has to have distortion levels so low as to make a grown man weep* at mind-shattering volume. It's a speaker that figures out the acoustics of the room it is in, then uses its speaker stack to make the best sound possible. The intelligent assistant doesn't interest Apple as much, because all those Core ML apps running on one's phone, iPad, and Mac will eventually do that for them.
For the last two years, I've come away from the keynote a little disappointed, only to be blown away by the Platforms State of the Union that follows it. The WWDC keynote is the stuff of public consumption and has become the second-string press conference next to the bigger hardware announcements of late August or early September. Apple follows several themes that make up its overall strategy. The first is to never reinvent the wheel: Apple will adapt its own technology and add technology from others to make products that delight its consumers. The second is to give third-party developers the absolute best tools to rapidly develop fantastic products, and to get many first-time developers working in its ecosystem. That second point is the backbone of WWDC. Others have looked for more consumer-oriented points in WWDC, but that's not the point. The biggest cheers yesterday were for refactoring, not HomePod, iMac Pro, or augmented reality. Apple has given the consumers their distractions; now the rest of the week is for the developers.
Steve Lipton is the mind behind makeapppie.com for learning iOS programming. He is the author of several books on iOS programming and development, and a LinkedIn Learning author. His latest is a trilogy on iOS notifications: Learning iOS Notifications, iOS App Development: Notifications, and watchOS App Development: Notifications, plus a quick lesson on delegation, Swift Delegations and Data Sources.
*Yes, a quote from Douglas Adams. As long as HomePods aren't yellow and don't hang in the air the way a brick doesn't, I won't get nervous.