WWDC 2017 – still on the hook

WWDC 2017. As with every Apple announcement, we all lined up to see if it held any nuggets of greatness. The general consensus from the wider community is that the announcement held some interesting prospects for the future, especially the new iOS frameworks for augmented reality and machine learning, and the HomePod. Here’s a quick summary of what we took from the WWDC 2017 announcement.

With this year’s Apple Worldwide Developers Conference (WWDC 2017) already underway, here are the six major announcements from Apple:

  • Apple TV to get an Amazon Prime App
  • Apple Watch to get some new watch faces and activity features syncing with gym equipment
  • macOS High Sierra & New iMac Pro
  • iOS 11
  • New 10.5” iPad Pro
  • HomePod

As I am a native mobile developer, I’ll focus on what’s new in iOS 11:

Core ML

The newly announced Core ML framework brings machine learning capabilities to our apps. A model is produced by applying a machine learning algorithm to a set of training data; the trained model can then make predictions from new input data. The Apple documentation for Core ML gives the example of predicting a house’s price from its number of bedrooms, using historical house price data.

Apple has been using machine learning in its own apps for a while now, most visibly in the Photos app, where facial recognition creates smart photo albums. Opening this capability up to developers is a game changer for doing machine learning locally on devices.
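
To make the house-price example concrete, here’s a minimal sketch of how a bundled model might be used. HousePricer is a hypothetical model: Xcode generates a Swift class like this from a compiled .mlmodel file, and the prediction inputs and outputs shown here are assumptions that depend entirely on that model.

```swift
import CoreML

// Hypothetical: Xcode generates a Swift class from a compiled .mlmodel file.
// Assume a model named HousePricer.mlmodel whose input is a bedroom count
// and whose output is a predicted price.
func predictPrice(bedrooms: Double) -> Double? {
    let model = HousePricer()                               // generated class (assumed)
    guard let output = try? model.prediction(bedrooms: bedrooms) else {
        return nil
    }
    return output.price                                     // generated output property (assumed)
}
```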

ARKit

Apple announced ARKit, a framework which will help add augmented reality to our apps. It does this by combining a live view of the device’s camera with virtual objects on the screen. One of the biggest difficulties in AR is positioning items on screen so that they look and behave correctly with the surrounding scene. This is where ARKit should help the developer.

In the simple demo given, ARKit looked to be detecting a table surface on which the app could then place an item with the correct position and dimensions. The example game shown at the keynote used a table’s surface to display a highly detailed 3D game, where the user walked around the table using their iPad to visualise the scene and play the game.
The best previous example of this technology was Pokémon Go, whose producers had to build that functionality from scratch. Apple opening this up as a framework should shorten the time to market for other apps wanting to offer this kind of experience. It’ll be exciting to see what developers come up with using this framework in the coming months.
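
As a rough idea of what that plane detection looks like in code, here’s a minimal sketch using ARKit with SceneKit. The view controller and its ARSCNView outlet are illustrative assumptions, not anything shown at the keynote.

```swift
import UIKit
import ARKit
import SceneKit

// A minimal sketch of horizontal plane detection with ARKit and SceneKit.
// sceneView is assumed to be an ARSCNView already in the view hierarchy.
class ARViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal   // look for surfaces such as table tops
        sceneView.delegate = self
        sceneView.session.run(configuration)
    }

    // Called when ARKit adds a new anchor, e.g. a detected horizontal plane.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        // Place virtual content relative to the detected surface here.
    }
}
```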

SiriKit

SiriKit has been updated with new domains and intents which allow it to do much more. These include:

  • Add notes
  • Transfer money using the new Visual Codes domain
  • To-do lists
  • Reminders
  • Expanded ride booking

While SiriKit is still not completely open, Apple has expanded the set of voice commands developers can offer their users. Siri itself has also been given an upgrade, with more natural-sounding voices and better on-device machine learning to provide more contextually aware suggestions. Siri will also gain a translation service, in beta, so you can ask it to translate a phrase. The beta will initially support English translations to Chinese, French, German, Italian, and Spanish, with more languages to follow.
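
As a hedged sketch of what adopting the new notes domain might look like, here’s an intent handler for creating a note inside an Intents extension. The CreateNoteHandler class and the saveNote(title:) helper are hypothetical stand-ins for an app’s own storage.

```swift
import Intents

// A minimal sketch of handling the new notes domain in an Intents extension.
// Assumes the app has its own note store; saveNote(title:) is a stand-in.
class CreateNoteHandler: NSObject, INCreateNoteIntentHandling {

    func handle(intent: INCreateNoteIntent,
                completion: @escaping (INCreateNoteIntentResponse) -> Void) {
        guard let title = intent.title?.spokenPhrase else {
            completion(INCreateNoteIntentResponse(code: .failure, userActivity: nil))
            return
        }
        // Hypothetical helper: persist the note in the app's own store.
        saveNote(title: title)
        completion(INCreateNoteIntentResponse(code: .success, userActivity: nil))
    }

    private func saveNote(title: String) {
        // Write to the app's shared container here.
    }
}
```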

Vision

The Vision framework provides a way to do high-performance image analysis in our apps. It goes further than the face detection already available in Core Image and lets us detect much more in our users’ photos, such as:

  • Barcodes
  • Text
  • Image registration (alignment)
  • The horizon

We’ll also be able to use it to track objects in video. The Vision framework can be integrated with Core ML to run custom machine learning models on images. Apple has also added QR code recognition to the Camera app, so we’re sure to see a resurgence of QR codes in the wild.
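
For a flavour of the API, here’s a minimal sketch of barcode detection with Vision; the same request/handler pattern applies to text, horizon, and face requests. The function name and the choice of dispatch queue are illustrative assumptions.

```swift
import UIKit
import Vision

// A minimal sketch of barcode detection with the Vision framework.
func detectBarcodes(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    // The completion handler is called with any barcode observations found.
    let request = VNDetectBarcodesRequest { request, error in
        guard let results = request.results as? [VNBarcodeObservation] else { return }
        for barcode in results {
            print("Found \(barcode.symbology) barcode: \(barcode.payloadStringValue ?? "")")
        }
    }

    // Perform the analysis off the main thread.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```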

AirPlay 2

The AirPlay protocol has been updated to version 2. The most significant upgrade is the ability to stream audio to multiple devices for multi-room listening, and this is available to third-party apps. Apple has also added speaker support to HomeKit, so speakers can be included in home automation ‘Scenes’.
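
For third-party apps, opting into AirPlay 2 appears to come down to the audio session’s new route sharing policy. This is a hedged sketch, assuming the iOS 11 setCategory(_:mode:routeSharingPolicy:options:) API and its string-based category constants; treat the exact call as an assumption to verify against the final SDK.

```swift
import AVFoundation

// A sketch of opting a long-form audio app into AirPlay 2 multi-room routing
// (assumed iOS 11 route-sharing-policy API).
func configureAudioSessionForAirPlay2() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(AVAudioSessionCategoryPlayback,
                                mode: AVAudioSessionModeDefault,
                                routeSharingPolicy: .longForm,
                                options: [])
        try session.setActive(true)
    } catch {
        print("Audio session configuration failed: \(error)")
    }
}
```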

MusicKit

Currently, developers only have access to a user’s local music library. With iOS 11, MusicKit gains access to the entire Apple Music catalogue when the user is a subscriber. The example given at the keynote was Shazam being able to offer the user the ability to add a recognised song to their library. In theory, developers could create their own alternative to the default Music app.
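
Here’s a hedged sketch of that Shazam-style flow on iOS, using the existing StoreKit and MediaPlayer APIs to check Apple Music capabilities and add a song to the library by its catalogue product ID. The class name and productID value are placeholders.

```swift
import StoreKit
import MediaPlayer

// A sketch of the flow described above: confirm the user is an Apple Music
// subscriber with library-add capability, then add a recognised song by its
// Apple Music product identifier.
final class AppleMusicLibraryAdder {
    private let cloudServiceController = SKCloudServiceController()

    func addSong(productID: String) {
        SKCloudServiceController.requestAuthorization { [weak self] status in
            guard status == .authorized else { return }

            self?.cloudServiceController.requestCapabilities { capabilities, error in
                guard capabilities.contains(.addToCloudMusicLibrary) else { return }

                // Add the catalogue song (identified by its product ID) to the user's library.
                MPMediaLibrary.default().addItem(withProductID: productID) { _, error in
                    if let error = error {
                        print("Could not add song: \(error)")
                    }
                }
            }
        }
    }
}
```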

Drag and Drop

With the new drag and drop feature added to iOS 11 for iPad, we can now easily allow a user to drag items from our apps into other apps, or receive items dragged from other apps.

Apps will need to adopt the new API to support drag and drop, but this addition makes the iPad a real multitasking device.
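
As a rough sketch of adopting that API, here’s how a view might vend a drag item; the view controller and its label outlet are illustrative assumptions, and dropping works the same way with UIDropInteraction and its delegate.

```swift
import UIKit

// A minimal sketch of adding drag support to a view with the iOS 11 API.
class DraggableLabelController: UIViewController, UIDragInteractionDelegate {
    @IBOutlet var label: UILabel!

    override func viewDidLoad() {
        super.viewDidLoad()
        label.isUserInteractionEnabled = true
        label.addInteraction(UIDragInteraction(delegate: self))
    }

    // Provide the items to hand over when a drag starts from the label.
    func dragInteraction(_ interaction: UIDragInteraction,
                         itemsForBeginning session: UIDragSession) -> [UIDragItem] {
        let text = label.text ?? ""
        let provider = NSItemProvider(object: text as NSString)
        return [UIDragItem(itemProvider: provider)]
    }
}
```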

Core NFC

A small feature, but one with huge potential: certain iOS devices will now be able to use their NFC chip for more than just Apple Pay! This has been requested by developers and users for as long as Apple have included NFC chips in their devices.

NFC could enable more ways for iOS apps to communicate with connected devices, as well as the possibility of iPhones one day replacing NFC-based cards such as London’s Oyster card for public transport. I’ll be excited to see what uses developers come up with for this.
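
Here’s a minimal sketch of what reading a tag might look like with Core NFC’s NDEF reader session; the TagReader class is an assumption for illustration. Worth noting that in iOS 11 the framework is read-only, so it reads NDEF messages rather than emulating cards.

```swift
import CoreNFC

// A minimal sketch of reading NDEF tags with Core NFC on a supported device.
class TagReader: NSObject, NFCNDEFReaderSessionDelegate {

    func beginScanning() {
        let session = NFCNDEFReaderSession(delegate: self,
                                           queue: nil,
                                           invalidateAfterFirstRead: true)
        session.begin()
    }

    // Called with any NDEF messages detected on the tag.
    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        for message in messages {
            for record in message.records {
                print("Payload: \(record.payload)")
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) {
        print("Session ended: \(error)")
    }
}
```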

Conclusion

This is just an overview of the major announcements made by Apple at the WWDC 2017 opening keynote, with much more detail coming over the rest of WWDC week, which I’m sure will be exciting for all developers. We look forward to getting our hands on the new technologies and using them. Want to discuss? Contact us today!

