Saturday, 31 October 2015

On-Demand Resources on tvOS

TL;DR - tvOS app bundles are limited to 200MB and there is no persistent local storage on the Apple TV, so if your app requires more than 200MB of static assets your option is to use On-Demand Resources.
iOS 9 introduced on-demand resources (ODR). On iOS this is useful for reducing the download time of your application, so that the user can start using your app sooner and so that the app can be downloaded over a mobile network connection. These benefits are nice to have but generally not key requirements for an iOS app MVP.
However, on tvOS the limit per app is 200MB (compared to 2GB on iOS), meaning there is a good chance you will need this feature if your app needs many resources, as there is no way to persist large amounts of data on the Apple TV as explained previously. This is probably because the Apple TV's capacity is either 32GB or 64GB, which is not much considering the Apple TV will be used for content-heavy and/or entertainment apps. Other relevant sizes for on-demand resources are defined in the documentation:
  • tvOS App bundle - the size of the sliced app bundle downloaded to the device: 200MB
  • Tag - a string that identifies a set of resources: 512MB
  • Initial install tags - the total sliced size of the tags marked for initial install: 2GB
  • Initial install and prefetched tags - the total sliced size of the tags marked for initial install and the tags marked for prefetching: 4GB
  • In use on-demand resources - the total sliced size of the tags in use by the app at any one time (a tag is in use as long as at least one NSBundleResourceRequest object is accessing it): 2GB
  • Hosted on-demand resources - the total pre-sliced size of the tags hosted on the App Store: 20GB
As recommended on the developer forums:
If you have content that is required for initial app launch, you can include up to 2GB of tags as initial install, which means the assets are included in the initial download from the App Store before the app can be launched. You can also mark assets for "prefetch", which means that the app can be launched even if they haven't been downloaded, but that tvOS will automatically start downloading them in the background when the app is purchased.
Tags are used to identify a set of resources that need to be downloaded; note that a resource can belong to multiple tags. Even though tags are limited to 512MB, it is probably best to keep tags much smaller, otherwise your app will have to wait longer while the whole pack downloads.
At any one time the app can use up to 2GB of static resources, which is plenty even for graphics-intensive games. It is important to note that this extra data is cached by the OS, so the app can request resources when it needs them and the OS will either download them or retrieve them from the cache. This means that, depending on the disk space left on the Apple TV, resources may need to be re-downloaded more or less frequently. This is important to understand when designing your app around ODR, since there is no guarantee that previously downloaded resources will be immediately available when requested.

Implementing on-demand resources

Apple has made available a guide to downloading and accessing on-demand resources that answers most questions. Luckily, it is quite straightforward to set this up and test it in Xcode.
Firstly, in your asset catalog (xcassets) set the resource tags the images belong to (images can belong to multiple tags):



In the project settings, define which tags should be "initial install tags", "prefetched tags" or "download on demand tags".

Note that it does not look like these states can be simulated: when running the app for the first time, all resources appear as not downloaded regardless of which category they belong to.

To download the images from a tag:
let tagString = "RedTag"
// Create a resource request with the required tag(s)
let resourceRequest = NSBundleResourceRequest(tags: [tagString])

// Optionally define the loading priority if downloading multiple resources,
// use NSBundleResourceRequestLoadingPriorityUrgent if resources are required now
resourceRequest.loadingPriority = NSBundleResourceRequestLoadingPriorityUrgent

// Check if the resources are cached first
resourceRequest.conditionallyBeginAccessingResourcesWithCompletionHandler({ (resourcesAvailable : Bool) -> Void in
  if (resourcesAvailable) {
    // Do something with the resources
    print("Resources originally available")
  } else {
    // Resources not available so they will need to be downloaded
    // using beginAccessingResourcesWithCompletionHandler:
    print("Resources will be downloaded")
    resourceRequest.beginAccessingResourcesWithCompletionHandler({ (error : NSError?) -> Void in
      if (error != nil) {
        print("Failed to download resources with error: \(error)")
      } else {
        // Do something with the resources
        print("Resources downloaded successfully")
      }
    })
  }
})
Optionally, set the preservation priority of a tag so that the operating system knows which resources to purge first.
let preservationPriorityForRedTag = 0.5
NSBundle.mainBundle().setPreservationPriority(preservationPriorityForRedTag, forTags: [tagString])
Once the resources are available, they can be accessed as they normally would, for example:
let redImage = UIImage(named: "Red.png")
Once the resources are no longer used, Apple recommends calling endAccessingResources() on the request: "Call this method as soon as you have finished using the tags managed by this request." If needed, this method will be called by the system when the resource request object is deallocated.
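For example, a view controller that keeps a reference to the request could release it once its content leaves the screen. A minimal sketch, assuming a stored resourceRequest property like the one created earlier:

```swift
// Sketch: release the tag as soon as its content disappears,
// so the OS is free to purge it under disk pressure
override func viewDidDisappear(animated: Bool) {
    super.viewDidDisappear(animated)
    resourceRequest.endAccessingResources()
}
```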
The progress of resource requests can be observed using KVO on the fractionCompleted property of the request's progress. The progress object can also be used to manage the request state, for example:
resourceRequest.progress.pause()
resourceRequest.progress.resume()
resourceRequest.progress.cancel()
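A minimal KVO sketch for the progress observation mentioned above; it assumes self retains resourceRequest and removes the observer when it is done with the request:

```swift
// Observe the request's progress; fractionCompleted goes from 0.0 to 1.0
resourceRequest.progress.addObserver(self,
    forKeyPath: "fractionCompleted",
    options: [.New],
    context: nil)

override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?,
    change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath == "fractionCompleted", let progress = object as? NSProgress {
        // Update a progress bar, log, etc.
        print("Download progress: \(progress.fractionCompleted)")
    }
}
```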
Resources can be purged from the simulator easily from Xcode:

Demo

The basic code to set up ODR explained in this article is on GitHub. Generally speaking, it may be more appropriate to wrap the ODR methods in an object that is custom to your needs. This way your app can manage the different states of the tags as they are needed. This is important as the priority of the resources being used will vary throughout the lifecycle of any application, and ideally the OS should not purge resources that are currently being used.

Saturday, 24 October 2015

Storing your data on tvOS, part 1


tvOS storage options

Continuing with our tvOS series (post 1, post 2), we shall look at data storage options on tvOS in this two-part article. The APIs themselves are not specific to tvOS; however, the limitations described here are particular to the Apple TV and may be different on iOS or Mac OS X devices.
There are three main ways of storing data on tvOS:
  1. The first one is the well known NSUserDefaults. While the official documentation is rather silent on this subject, it seems that this method's limit on tvOS is 500 KB according to Apple.
  2. The second method is iCloud Key Value Storage (KVS). This works in a similar fashion to NSUserDefaults but it's stored in the cloud instead. Furthermore, there is space for up to 1 MB and data can be shared between tvOS and iOS devices if desired. So if you need more than 500 KB and/or you want to share data among devices, this is a method worth exploring.
  3. The final method discussed in this article, is CloudKit. CloudKit allows you to store as much as you want, the only limitation being the user's iCloud storage limit. We shall discuss CloudKit in detail in the second part of this article. If you need to store relatively large amounts of data, this is clearly the preferred method.

NSUserDefaults

NSUserDefaults on tvOS works in the same way it does on Mac OS X or iOS. You must obtain the NSUserDefaults singleton, and then you can save or retrieve key-value pairs.

Saving and retrieving values

After you've obtained the NSUserDefaults singleton, you can call setString, setBool, setObject, etc. to store key-value pairs. To retrieve values you can call stringForKey, boolForKey, objectForKey, etc. For example:
Saving:
NSUserDefaults.standardUserDefaults().setObject(textField.text, forKey: textKey)
NSUserDefaults.standardUserDefaults().synchronize()
Retrieving:
guard let loadedText = NSUserDefaults.standardUserDefaults().objectForKey(textKey) as? String else {
    return
}

iCloud KVS

You can think of iCloud KVS as a sort of NSUserDefaults in the cloud. On tvOS you get twice as much storage space (up to 1MB) and the ability to sync data among devices running iOS or tvOS.

Enabling iCloud KVS in your app

Go to Xcode and enable the com.apple.developer.ubiquity-kvstore-identifier entitlement for your app:

Saving and retrieving values

In a similar fashion to NSUserDefaults, iCloud KVS uses the singleton NSUbiquitousKeyValueStore to store and retrieve values based on keys.
To save a value into the key-value store, you get the default store and then call the appropriate setter: setString, setBool, setObject, etc.
For example:
let store = NSUbiquitousKeyValueStore.defaultStore()
store.setString(textField.text, forKey: textKey)       
To retrieve a value, you use the same singleton but call stringForKey, boolForKey, objectForKey, etc. For example:
let store = NSUbiquitousKeyValueStore.defaultStore()
textField.text = store.stringForKey(textKey)

Keeping track of value changes

When using iCloud KVS, any device that is running your app can change the stored values. Furthermore, if you're using the same app identifier between your iOS app and your tvOS app, they can share data through iCloud KVS.
It's therefore convenient to register for the NSUbiquitousKeyValueStoreDidChangeExternallyNotification notification as soon as possible. In this way, you make sure your app starts off with the latest available data. Note that if you exceed the maximum iCloud KVS capacity, this notification will be issued with a change reason of NSUbiquitousKeyValueStoreQuotaViolationChange. For example, in the demo project, I've registered in the AppDelegate:
func application(application: UIApplication, didFinishLaunchingWithOptions launchOptions: [NSObject: AnyObject]?) -> Bool {
    
    NSNotificationCenter.defaultCenter().addObserver(
        self,
        selector: "storeDidChange:",
        name: NSUbiquitousKeyValueStoreDidChangeExternallyNotification,
        object: NSUbiquitousKeyValueStore.defaultStore())
    
    // Download any changes since the last time this instance of the app was launched
    NSUbiquitousKeyValueStore.defaultStore().synchronize()
    
    return true
}   
In the following example, we print the keys that have changed on iCloud's server:
    
func storeDidChange(notification: NSNotification) {
    guard let userInfo = notification.userInfo else {
        return
    }
   
    guard let reasonForChange = userInfo[NSUbiquitousKeyValueStoreChangeReasonKey] as? NSNumber else {
        return
    }
    
    // Update local values only if we find server changes
    let reason = reasonForChange.integerValue
   
    if reason == NSUbiquitousKeyValueStoreServerChange ||
        reason == NSUbiquitousKeyValueStoreInitialSyncChange {
            
            // Get Changes
            // Get changes
            guard let changedKeys = userInfo[NSUbiquitousKeyValueStoreChangedKeysKey] as? [String] else {
                return
            }
            let store = NSUbiquitousKeyValueStore.defaultStore()
            
            // Print keys that have changed
            for key in changedKeys {
                let value = store.objectForKey(key)
                print("key: \(key), value: \(value)")
            }
    }
 }
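The quota violation reason mentioned earlier can be handled in the same observer. A sketch; how you trim data is app-specific, and the key name below is hypothetical:

```swift
// Inside storeDidChange: react to running out of iCloud KVS space
if reason == NSUbiquitousKeyValueStoreQuotaViolationChange {
    // The 1MB quota was exceeded: remove or shrink non-essential keys,
    // then try synchronizing again
    let store = NSUbiquitousKeyValueStore.defaultStore()
    store.removeObjectForKey("cachedThumbnailData") // hypothetical key
    store.synchronize()
}
```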

Resolving conflicts

When a device attempts to write a value to iCloud's KVS storage, iCloud will check whether any recent changes have been made by other devices before storing your value. If changes have been made recently, iCloud will issue a NSUbiquitousKeyValueStoreDidChangeExternallyNotification, forcing your app to update itself with the server's data. You must then decide how the server values should be dealt with. For example, if your app is a game and tries to update the user's points to, say, 1000 but receives a notification with the server value being 1500, you would assume that the server data is more recent and that your app should therefore take 1500 as the most up-to-date value instead of forcing the server to store 1000.
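A sketch of that highest-wins policy; the key name "points" and the rule itself are assumptions, and your app may well need a different merge strategy:

```swift
// Resolve a points conflict by keeping whichever value is higher
func resolvePointsConflict(localPoints: Int) {
    let store = NSUbiquitousKeyValueStore.defaultStore()
    let serverPoints = Int(store.longLongForKey("points"))
    if localPoints > serverPoints {
        // Our local value wins: push it to the server
        store.setLongLong(Int64(localPoints), forKey: "points")
    }
    // Otherwise keep the server value and update the local copy instead
}
```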

Demo

Everything discussed in this article can be found in a demo project on GitHub.

Saturday, 17 October 2015

Interacting with the new Apple TV remote

TL;DR: In simple terms, the user can interact with the Apple TV remote by pressing a few buttons, using the single-touch trackpad or the accelerometer.
Following on from last week's post by Nahuel, this post continues explaining our experiences with developing for tvOS, specifically covering the various user inputs available from the new Apple TV remote, with an example demo project. Aside from the accelerometer inputs, which need a physical remote, everything else can be tried on the simulator. Generally speaking, it is possible to access most buttons on the remote and events on the Glass Touch surface (the trackpad part of the remote). However, as of tvOS beta 3 the microphone on the remote is not accessible, even though it would be very useful, for example for dictation.

Gesture recognizers

Currently, the gesture recognizers supported by tvOS are: 
  • UITapGestureRecognizer
  • UISwipeGestureRecognizer
  • UIPanGestureRecognizer
  • UILongPressGestureRecognizer
All the UITapGestureRecognizer events and the UILongPressGestureRecognizer can also be used with the old remote, since that remote can be linked to the new Apple TV.

Tap gesture recognizer example

To get the events from buttons pressed on the Apple TV remote, a UITapGestureRecognizer can be used. The Apple TV remote supports 7 different press types, corresponding to the buttons on the remote or taps on a specific part of the remote trackpad (referred to as the Glass Touch surface). The currently supported press types (from the documentation) are:
public enum UIPressType : Int {
    case UpArrow
    case DownArrow
    case LeftArrow
    case RightArrow
    case Select
    case Menu
    case PlayPause
}
For example, UIPressType.Select would be set on the UITapGestureRecognizer to capture a press of the select button:
// Add the gesture recognizer
let tapGestureRecognizer = UITapGestureRecognizer(target: self, action: "selectTapped:")
let pressType = UIPressType.Select
tapGestureRecognizer.allowedPressTypes = [NSNumber(integer: pressType.rawValue)];
view.addGestureRecognizer(tapGestureRecognizer)

func selectTapped(tapGestureRecognizer : UITapGestureRecognizer) {
    // Handle select button tapped
}
Note that when using this in the simulator, the arrow taps can be triggered using the simulated Apple TV remote or the arrow keys on the physical keyboard.
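Several press types can also be routed through a single recognizer by listing them all in allowedPressTypes. A sketch, with a hypothetical arrowTapped: action:

```swift
// One recognizer for both the up and down arrow taps
let arrowTapRecognizer = UITapGestureRecognizer(target: self, action: "arrowTapped:")
arrowTapRecognizer.allowedPressTypes = [
    NSNumber(integer: UIPressType.UpArrow.rawValue),
    NSNumber(integer: UIPressType.DownArrow.rawValue)
]
view.addGestureRecognizer(arrowTapRecognizer)
```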

Swipe gesture recognizer example

Detecting swipes on the trackpad part of the remote can be achieved with a UISwipeGestureRecognizer. The currently supported swipe directions are the same as on iOS, i.e. Right, Left, Up and Down. For example:
// Add the swipe gesture recognizer
// Add the swipe gesture recognizer
let swipeGestureRecognizer = UISwipeGestureRecognizer(target: self, action: "rightSwipe:")
swipeGestureRecognizer.direction = UISwipeGestureRecognizerDirection.Right
view.addGestureRecognizer(swipeGestureRecognizer)

func rightSwipe(swipeGestureRecognizer : UISwipeGestureRecognizer) {
    // Handle right swipe
}

Pan gesture recognizer example

The trackpad can be used to track the relative position of the user's finger. A UIPanGestureRecognizer is used for this, for example:
let panGestureRecognizer = UIPanGestureRecognizer(target: self, action: "userPanned:")
view.addGestureRecognizer(panGestureRecognizer)
  
func userPanned(panGestureRecognizer : UIPanGestureRecognizer) {
  // Handle pan translation
  let translation = panGestureRecognizer.translationInView(self.view)
}

Long press recognizer example

Long presses can only be detected on the select button, since the API does not seem to allow configuring different press types the way UITapGestureRecognizer does. Hence, a UILongPressGestureRecognizer can be used to detect long presses on the select button, i.e. pressing and holding the trackpad (or the centre button on the old remote). For example:
let longPressGestureRecognizer = UILongPressGestureRecognizer(target: self, action: "longPress:")
view.addGestureRecognizer(longPressGestureRecognizer)
  
func longPress(longPressGestureRecognizer : UILongPressGestureRecognizer) {
  // Handle long press
}

Other gesture recognizers

Even though they are described in the documentation, I could not get UIPinchGestureRecognizer or UIRotationGestureRecognizer to work. This may be because they require multitouch, which does not appear to be supported on the new Apple TV remote.

UIResponder events

Low level event handling is supported on the remote, meaning that UIResponder events can be used. The main caveat I have found is that there is no apparent concept of an absolute position of the user's input on the trackpad. This means that regardless of where the user starts touching the trackpad, the initial location will be reported as the middle of the screen (960, 540).
override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
  guard let firstTouch = touches.first else { return }
  let locationInView = firstTouch.locationInView(firstTouch.view)

  // This will always print x:960.0 y:540.0
  print("touchesBegan x:\(locationInView.x) y:\(locationInView.y)")   
}
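Since every touch begins at that fixed point, relative movement is what's actually useful. A sketch tracking the delta between consecutive touch events:

```swift
// Track movement relative to the previous touch location, since the
// absolute starting point is always reported as (960, 540)
override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
  guard let touch = touches.first else { return }
  let current = touch.locationInView(touch.view)
  let previous = touch.previousLocationInView(touch.view)
  let delta = CGPoint(x: current.x - previous.x, y: current.y - previous.y)
  print("moved by x:\(delta.x) y:\(delta.y)")
}
```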

Accelerometer

If you were one of lucky ones to receive a golden ticket from Willy Wonka then you'll be able to test the accelerometer on the Apple TV remote, otherwise you'll have to wait a few weeks.
The first thing to do is get the Apple TV remote controller. This is done by listening for the GCControllerDidConnectNotification; once connected, a var can be set to keep a reference to the controller, and finally a closure can be defined to listen for accelerometer changes. From what I have investigated, it looks like only gravity and userAcceleration can be observed.
import UIKit
import GameController

class ViewController: UIViewController {
  
  var controller : GCController?
  
  override func viewDidLoad() {
    super.viewDidLoad()
    NSNotificationCenter.defaultCenter().addObserver(
      self,
      selector: "controllerDidConnect:",
      name: GCControllerDidConnectNotification,
      object: nil)
  }
  
  func controllerDidConnect(notification : NSNotification) {
    controller = GCController.controllers().first
    controller?.motion?.valueChangedHandler = { (motion : GCMotion) -> () in
      
      // Whatever you want to do with the gravity and userAcceleration
      
    }
  }
}
This means that even though attitude and rotationRate are properties of the GCMotion object returned by the closure, they are documented not to work. From the GCMotion header documentation:
/**
 @note Remotes can not determine a stable attitude so the values will be (0,0,0,1) at all times.
 */
public var attitude: GCQuaternion { get }
    
/**
 @note Remotes can not determine a stable rotation rate so the values will be (0,0,0) at all times.
 */
public var rotationRate: GCRotationRate { get }

Demo app

To try out the various user inputs I've posted a demo app on GitHub. The project is very simple: it has one view controller where the various remote inputs can be tested. Depending on the user input, a label flashes red to give visual feedback that the event has been handled. Note that motion values will not be available on the simulator, as a physical remote is needed for those.