Pokemon Go for Runners, Developers and Businesses

Back in the day, my son collected Pokemon cards, played Pokemon on Gameboy, and taught me about Pikachu, Snorlaxes, and other interesting creatures…as I’m sure the kids of many others my age did. As my son grew older, he gave his Pokemon card collection to someone much younger who had more enthusiasm (a very generous move, one he semi-regretted when he saw the prices for some of those cards on eBay!) and moved on to other things. Now in his mid-twenties, my son and I are playing Pokemon Go, semi-together from 200 miles away.

Despite the articles about the “nerd herd” and getting the geeks out from behind their computers (which is a pretty good thing, IMHO), and in addition to the aforementioned family camaraderie (I loudly applaud those friends of mine that are actively playing with their kids), there are other obvious reasons certain people should become familiar with this app/game:

DEVELOPERS

Pokemon Go is the top free app (with in-app purchases) on the Apple App Store and Google Play Store in the US, the UK and multiple other countries, and has been there since its release. It is the fastest app to reach 10 million downloads worldwide, reaching that mark in seven days (source). It also currently leads all apps in daily usage time, i.e., how long users actually have the app open (source).

It did have a bit of a head start in both content and database:

  • The game is built on top of Ingress, another Niantic game that is similar in play to Pokemon Go but has a different storyline. From what I understand, all of the locations and landmarks in Pokemon Go originated from the database that Ingress uses.
  • The content head start is obvious – the previous cards and games provide not only the 151 pokemon in the current game, but fodder for expansion in later games…and a knowledgeable audience familiar with how the game might work.


There are some characteristics of the game that will be immediately recognizable, especially to those who played previous pokemon games. But the basics are familiar to anyone who has used any count/goal-based program: collect everything and level up. This is a common development model, whether it is for a beer drinking app like Untappd (see my breakdown of the Untappd app here), a healthcare/shopping app like Walgreens, or game apps. There are badges for most everything (similar to programs like Untappd), though I seem to rarely look at them, other than for counts.

There are some characteristics that are missing:

  • there isn’t any type of social sharing. On Untappd you can toast a friend’s beer check-in; here you cannot high-five your friends when they catch a rare pokemon.
  • a user cannot see their friends in-game. Though this would be great for multi-player play, it would certainly complicate the program, and could enable a bit of stalking (if it were done without some type of permission).

These are holes that will be filled, either in future releases or by independent developers. An entire ecosystem is already springing up around the game: chat apps (see this developer’s app blog), for example, which I assume are used to tell people when a rare pokemon is near. There are also several hacks, such as maps that use the app protocols to determine locations of pokemon, pokestops, etc. (most of these can be found in the pokemondev sub on reddit). Some of these are getting shut down; one even mentioned a “cease and desist” order.

The “augmented reality” piece, where you can use your device’s camera to see pokemon against the background of the real world, is interesting but unnecessary in this game. It is such a battery drain that I do not know of any players who have left it on. It is being used primarily as a novelty (I found a pokemon at a landmark) or by businesses to lure pokemon hunters in.

ENTREPRENEURS and INVESTORS

Estimates of how much the game has made for the various parties vary. One estimate says that Apple, purely on the percentage it receives from in-app purchases through the app, will make $3 BILLION in revenue over the next couple of years (source). Since Apple gets 30% of in-app purchases, that would imply an estimate of $7 BILLION (the remaining 70%) for Niantic (one would assume this gets shared with Nintendo for licensing).

There is, of course, no need to spend money in the game if you choose not to (full disclosure: I do not). Sensor Tower estimates $1.6 million per day is being spent in the US. And the app has not yet been launched in Japan, where the average spend per mobile user is higher and the Pokemon craze is even more rabid.

Nintendo’s stock price doubled following the release of the app (chart here) though it has retreated a bit from those highs.

Local businesses are taking advantage as well. Yelp now lets users filter based on pokestop locations. Many shopping areas and downtowns will have multiple pokestops near them. In the game, there are items known as “Lures” which do what the name implies (they lure pokemon to a pokestop for 30 minutes). When this happens, the pokestop lights up on the map, shooting purple pieces up like flares. Small businesses near pokestops are dropping these lures to draw players in while they hunt.

INFRASTRUCTURE and SYSTEM ADMINISTRATORS

Pokemon Go is almost as well-known in these first few weeks for server crashes as it is for having more users than most other applications. Since Niantic spun out of Google, one would assume that they run on Google infrastructure. They are not on Amazon Web Services (AWS); the Amazon CTO has repeatedly (and humorously) offered help over Twitter whenever the servers are down.

As the game added multiple countries this past weekend (July 16), the servers supporting the game crashed repeatedly, leaving the game inoperable most of that Saturday morning.

The image on the right is all that the players see. There is no notice that the game is having server issues. So users either continue to press “retry” (which comes up after a few minutes of this screen) or kill the app and start over…both of which cause more login attempts and impact on the servers.

From a capacity planning standpoint, one would assume that a trending analysis would be done on the initial implementation in the United States before adding the multiple additional countries. Either this was not done or it was done incorrectly, and the added load crashed the servers.

This is tolerated somewhat humorously for now (check out the Pokemon Go reddit forums for examples). But if there are tours, events and other plans made around the app (as there were that Saturday), this will not be acceptable to the user community for long.

Interestingly, as of this writing, Niantic is advertising for a Software Engineer – Server Infrastructure…probably a much-needed position just now!

RUNNERS

My fellow joggers: we have an enormous advantage in this game of Pokemon Go. And this infuriates my son…and is the only reason I can even begin to keep up with him in this game (and with the many teenagers that are on summer break and do not have to work). That advantage is that mileage matters in several different facets of the program:

  • To hatch eggs, the player has to travel either 2K, 5K or 10K – depending on the type of egg. This distance cannot be travelled in a car (many have tried) so it certainly favors runners. During these summer months, I average 25-30 miles a week which builds up to a lot of hatched eggs.
  • When using incense (which I call perfume, much to the chagrin of my son), the player will see more pokemon when moving at a faster rate. I’ve seen some tests where a stationary player only sees a pokemon every five minutes with the incense, but a player moving quickly sees one every minute. If you do this as a runner, I highly suggest that you make sure you have enough pokeballs.
  • If you have a Lucky Egg (which doubles your XP earning for 30 minutes), it can be a great combination with incense while running. I did this twice, for parts of two separate four-mile runs, averaging between a 9:00 and 9:45 pace in the lovely South Texas heat and humidity. In the 30 minutes the incense and Lucky Egg were active the first time, I caught 21 pokemon (missed 1) and gained 6000 XP. So…not quite one per minute, but not bad. The second time, I caught 25, missed one, and gained 9000 XP.

It may be obvious, but the downsides to running with the game are:

  • Pace is slower (at least mine is) due to distraction. I had been able to flick the pokeballs while running, but it only took running out of pokeballs once to stop that foolishness. Now some of those one-handed throws are acting like curveball throws, without me meaning to throw them. That may be related to the next problem.
  • Down here in Southeast Texas, sweat is a problem. When I run, it is usually 80 degrees and 70-80% humidity. It is very irritating to try to throw a pokeball while running with sweat on your fingers. It can be done, but who needs that kind of challenge? And as I mentioned in the previous bullet, I’m seeing some unintentional curveball throws, which may be due to me sweating on the screen.

I have an old Google Glass from an earlier development project. Glass would be a great accessory for this game, and for all games that combine the real world with augmented reality. The ability to see landmarks and have heads-up display facts and stats was one of the benefits of Glass. Unfortunately, the issues it had, particularly with battery life, would have to be fixed. And it had a sweat problem (i.e., sweat be bad for Glass). But imagine just running along and speaking commands to Glass about throwing pokeballs…those that make claims of “nerd herd” would have a field day with that one!

My current collection is below. Have fun!


Xcode – Simulator vs. Device: CAPITALIZATION matters

There are, obviously and intuitively, differences between testing an iOS app on the Xcode Simulator and testing on a real device. The obvious ones run the gamut from the Simulator having no camera to the keyboards behaving differently. The intuitive ones, in my mind, come from the fact that the Simulator is running on a different operating system (OSX) than the devices (iOS) the app is intended for.

The difference that repeatedly bites me is: CAPITALIZATION matters.

The majority of the apps I do at JoSara MeDia are HTML5 apps in a framework called Baker. If you are interested, the rationale behind this is that most of the apps are either coming from books or eBooks (and hence are already in a format close to HTML, like ePub) or are heading in that direction (so we want to make conversion easy).

I was putting in a very nice jPlayer-based audio player (called jquery.mb.miniAudioPlayer – check out the link, it is quite well done), and it looked great on the simulator, as you can see in the screenshots below. I tested it on several different simulator devices – all looked as expected, and all performed the autoplay function when expected.

Quebradillas Audio 2

In case you are interested, this is from a forthcoming “coffee table poetry book as an app” project called Quebradillas.

Quebradillas Audio 1

But, once I transferred the app to a device (either through cable or TestFlight) the audio player graphics did not make the transition (see screenshot below). And neither did the autoplay functionality.

Quebradillas on Device

 

The culprit, of course, was two instances of capitalization. One was in the name of the included css file – in the head of the two pages, the “q” in “jquery” was lower case, but, as you can see from the Xcode screenshot, the file name itself was “jQuery.” This was acceptable in the Simulator, which runs on OSX, but would not work (and, interestingly, did not pop up an error anywhere) on the devices tested (iOS). After looking at the javascript code in the jquery plugin, I could see that the “Vm” and “P” were icon placeholders…which led me to the css file misspelling.

The autoplay issue was, again, capitalization: the parameter in one of the examples had autoplay in camelCase (i.e., autoPlay), but in the mb.miniAudioPlayer.js, the parameter was simply “autoplay.”

 

By noting this, I aim to remind my future self to use capitalization as one of the first items to check when apps look different in the simulator vs. on the device, especially when using HTML5 app frameworks.
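Since this keeps biting me, a small script can automate the check. Below is a sketch (the filenames involved are just examples) that compares each href/src reference in an HTML page against the actual, case-sensitive directory listing, flagging anything that matches a real file in name but not in case:

```python
import os
import re

def find_case_mismatches(html_path):
    """Return (referenced, actual) pairs whose capitalization differs."""
    base = os.path.dirname(os.path.abspath(html_path))
    with open(html_path) as f:
        html = f.read()
    # Pull out href/src values -- good enough for a quick sanity check
    refs = re.findall(r'(?:href|src)="([^"]+)"', html)
    # Map lowercased names to the real on-disk names; os.listdir reports
    # the true capitalization even on a case-insensitive OSX filesystem
    actual = {name.lower(): name for name in os.listdir(base)}
    mismatches = []
    for ref in refs:
        name = os.path.basename(ref)
        real = actual.get(name.lower())
        if real is not None and real != name:
            mismatches.append((name, real))
    return mismatches
```

Anything it reports will load fine in the Simulator (case-insensitive filesystem) but silently fail on a device.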


Using Apple’s HLS for video streaming in apps

Overview

All of the apps JoSara MeDia currently has in the Apple app store (except for the latest one) are self-contained; all of the media (video, audio, photos, maps, etc.) are embedded in the app. This means that if a user is on a plane or somewhere that they have no decent network connection that the apps will work fine, with no parts saying “you can only view this with an internet connection.”

This strategy works very well except for two main problems:

  • the apps are large, frequently over the size limit Apple designates as the maximum for downloading over cellular. I have no data that says this would limit downloads, but it seems obvious;
  • if we want to migrate apps to the AppleTV platform (and of course we do!), we have to have a much more cloud centric approach, as Apple TV has limited storage space.

These two issues prompted me to use the release of our Quebec City app as a testing ground for moving the videos included in the app (and the largest space consuming media in the app) into an on-demand “cloud” storage system. I determined the best solution for this is to use Apple’s HTTP Live Streaming (HLS) solution.

There are still many things I am figuring out about using HLS, and I would welcome comments on this strategy.

What is HLS and why would you use it

For most apps, there is no way to predict what bandwidth your users will have when they click on a video inside your app. And there is an “instant gratification” requirement (or near instant) that must be fulfilled when a user clicks on the play button.

Have you ever started a video, had it show as grainy or lower quality, and then watched it get more defined as it plays? That is an example of HLS with variant playlists at work (other protocols do this as well).

Simply put, with HLS a video is segmented into several time segment files (denoted by the file extension .ts) which are included in a playlist file (denoted by the file extension .m3u8) which describes the video and the segments. The playlist is a human readable file that can be edited if needed (and I determined I needed to, see below).
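For reference, a minimal playlist for a roughly 20-second video cut into 10-second segments looks something like this (the segment file names are just examples):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
fileSequence0.ts
#EXTINF:10.0,
fileSequence1.ts
#EXT-X-ENDLIST
```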

Added on to this is a “variant playlist”, a master playlist file in the same format that points to other playlist files. The concept behind the variant playlist is to have videos of multiple resolutions and sizes but with the same time segments (this should be prefaced with “I think”; comments appreciated). When a video player starts playing a video described by a variant playlist, it starts with the lowest-bandwidth playlist, which is by definition smaller and therefore should download and start to play the quickest, satisfying that most human need: instant gratification. It then determines through a handshake what bandwidth and resolution the device playing the video can handle, and ratchets up to the best playlist in the variant playlist to continue playing. I am assuming (by observation) that it will only ratchet up to a higher-resolution playlist at the time segment breaks (which is also why I think the segments all have to be the same length).
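A variant playlist, then, is just a list of pointers tagged with bandwidth and resolution attributes, roughly like this (the bandwidth numbers and resolutions here are illustrative; the playlist names match the 400K/1M/2M naming used elsewhere in this post):

```
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=400000,RESOLUTION=480x270
hls_400k.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1000000,RESOLUTION=640x360
hls_1m.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=2000000,RESOLUTION=1280x720
hls_2m.m3u8
```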

Options for building your videos and playlists

There are two links that provide standards for videos for Apple iOS devices and Apple TVs (links below):

These standards do overlap a bit, but, as you would expect, the Apple TV standards have higher resolution because of an always connected, higher bandwidth (minimum wifi) connectivity than one can expect with an iPhone or iPad.

To support iPhones, iPads and Apple TVs, the best strategy would be to have 3 or 4 streams:

  • a low-bandwidth/low-resolution stream for devices on cellular connections
  • a mid-range stream for iPhones and iPads on WiFi
  • a high-bandwidth/high-resolution stream for always-connected devices like Apple TVs

Thus the steps become:

  1. Convert your native video streams into these multiple resolutions;
  2. Build segmented files with playlists of each resolution;
  3. Build a variant playlist that points to each of the single playlists;
  4. Deploy, making sure the variant playlist and files are on a content distribution network which will get them cached around the world for faster delivery (only important if you are assuming worldwide customers…which you should);
  5. Put the video tags in your apps.

Converting your native video streams:

My videos are in several shapes and resolutions, since they come from whatever device I have on me at the time. They are usually from an Olympus TG-1 (which has been with me through the Grand Canyon, in Hawaii, in cenotes in the Yucatan, and now in Quebec City) – my indestructible image and video default – or some kind of iOS device. Both are set to shoot in the highest quality possible. This makes the native videos very large (and the apps they are embedded in larger still).

There are several tools to convert the videos. These are the ones I’ve looked into:

  • Quicktime – the current version of Quicktime is mostly useless in these endeavors. But Quicktime 7 does have all of the settings required in the standards links from the first paragraph in this section. One could go through and set those exactly as specified, get the video output, then put it through the Apple mediafilesegmenter command line tool. If you do not have an Amazon Web Services (AWS) account, this would most likely be the way to proceed…as long as this version of Quicktime is supported. To get to these options go to File -> Export, select “Movie to MPEG-4” and click on the “Options” button. All of the parameters suggested in the standards for iPhone and iPad are available here for tweaking and tuning. Quicktime 7 can still be downloaded from Apple at this link.
  • Amazon Web Services (AWS) Elastic Transcoder – for those of us that are lazy like me, AWS offers “Elastic Transcoder”, which provides preset HLS options for 400K, 1M and 2M. The encodings are done through batch jobs, with detailed output selections. There are presets for HLSv3 and HLSv4. HLSv4 requires a split between audio and video; this may be a later standard, but I could not get the inputs and outputs correct…therefore, I went with the HLSv3 standards. The setup requires:
    • Setting up S3 repositories that hold the original videos, and destination repositories for the transcoded files and thumbnails (if that option is selected)
    • Creating a “Pipeline” that uses these S3 repositories
    • Setting up an Elastic Transcoder “job” in this pipeline (as an old programmer, I’m hoping this name is a nod to the batch jobs of old!), where you tell it what kind of transcoded output should come out of the job, and what kind of playlist.

AWS Elastic Transcoder

  • iMovie – iMovie has several output presets, but I did not find a way to easily adapt them to the settings in the standards.

Building segmented files

Once you have your videos converted, the next step is to build the segmented files (the files that end with .ts) plus the playlists that contain the metadata and locations of the segmented files. There may be other tools, but these are the only two I have found.

  • Apple dev tools – Apple’s HLS Tools require an Apple developer account. To find them, go to this Apple developer page, and scroll down to the “Download” section on the right hand side (requires developer login). The command to create streams from “non-live” videos is mediafilesegmenter. To see the options, either use the man pages (type “man mediafilesegmenter” at the command line) or just type the command for a summary. There are options for the time length of the segments, creation of a file for variant playlists, encryption options and others. I found through much trial and error that using the “file base” option (-f) to put the files in a particular folder, and omitting the “base URL” option (-b), worked best (I didn’t omit it at first, not realizing that the variant playlist which points at the individual stream playlists can point to file folders to keep things neat). In the end, I used this command to create 10-second segments of my original (not the encoded) files, to create a high-resolution/high-bandwidth option.
  • Amazon Web Services (AWS) Elastic Transcoder – the Elastic Transcoder not only converts/encodes the videos, but will also build the segments (and the variant playlists). As you can see from the prior screenshot, there are system presets for HLSv3 and v4 (again, I used v3) for 400K, 1M and 2M. The job created in Elastic Transcoder will build the segmented files in designated folders with designated naming prefixes, all with the same time segment. I have, however, seen some variance in quality using Elastic Transcoder…or at least a disagreement between the Apple HLS Tools validator and the directives given in the Elastic Transcoder jobs. More on that in the results and errors section.
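Putting the mediafilesegmenter options above together, the invocation looks roughly like this (the input file name is hypothetical, and the flag spellings are from memory – check “man mediafilesegmenter” before trusting them):

```
# 10-second segments, written into the BikeRide folder, no base URL
mediafilesegmenter -t 10 -f BikeRide/ BikeRideOriginal.mov
```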

Building variant playlists

Finally, you need to build a variant playlist, which is a playlist that points to all of the other playlists of different resolution/bandwidth option time segments.

  • Apple dev tools – the variantplaylistcreator cmd-line command will take output files from the mediafilesegmenter cmd-line command and build a variant playlist.
  • Amazon Web Services (AWS) Elastic Transcoder – are you detecting a pattern here? As part of an Elastic Transcoder job, you specify either an HLSv3 or HLSv4 variant playlist as the end result. I selected v3, as the v4 selection requires separate audio and video streams and I could never quite get those to work.
  • Manual – the playlist and variant playlist files are human readable and editable.

Currently, I am using a combination of Elastic Transcoder and manual editing. I take the variant playlist that comes out of Elastic Transcoder (which contains the 400K, 1M and 2M playlists), then edit it to add the higher-res playlist I created using mediafilesegmenter. This gives a final variant playlist with four options that straddle the iOS device requirement list and the Apple TV requirement list.

Putting the videos into your apps

Most of my apps are HTML5 using a standard called HPUB. This is to take advantage of multiple platforms, as HPUB files can be converted with a bit of work to ePub files for enhanced eBooks.

To use the videos in HTML5 is straightforward – just use the <video> tag and put the variant playlist file in the “src=” parameter.
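In other words, something like this (the URL is a placeholder for wherever the playlists are hosted):

```html
<video controls preload="none"
       src="https://example.cloudfront.net/BikeRide/variant.m3u8">
  Your browser does not support the video tag.
</video>
```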

Results and Errors

In the end, the videos work, and seem to work for users around the world, with low or high bandwidth, as expected. I’m sure there are things that can be done to make them better.

I’ve used the mediastreamvalidator command from the Apple Developer tools pretty extensively. It doesn’t like some of the things about the AWS Elastic Transcoder generated files, but it is valuable in pointing out others.

Here are some changes I’ve made based on the validator, and other feedback:

Error: Illegal MIME type – this one took me a bit. The m3u8 files generated by AWS are fine, but files such as those generated by the mediafilesegmenter tool do not pass this check. They get tagged by the error “–> Detail: MIME type: application/octet-stream”. In AWS S3 there is a drop-down list of MIME types in the “Metadata” section, but none of the recommended Apple MIME types are there. The files generated by AWS have the MIME type “application/x-mpegURL”, which is one of the recommended ones. Since it is not a selection in the drop-down, it took me a while to determine that you can actually just manually enter the MIME type into the field, even if it is not in the list. Doh!
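For what it’s worth, if you upload with the AWS CLI instead of the console, you can set the MIME type at upload time and skip the drop-down entirely (bucket name and paths here are hypothetical):

```
# upload just the playlists with the recommended MIME type
aws s3 cp BikeRide/ s3://my-video-bucket/BikeRide/ --recursive \
    --exclude "*" --include "*.m3u8" \
    --content-type "application/x-mpegURL"
```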

Time segment issues – whether utilizing AWS Elastic Transcoder or the mediafilesegmenter cmd line tool, I’ve always used 10 second segments. Unfortunately, either Elastic Transcoder isn’t exact or the mediastreamvalidator tool does not agree with Transcoder’s output. Here’s an example as a snip from mediastreamvalidator’s output:

Error: Different target durations detected

–> Detail:  Target duration: 10 vs Target duration: 13

–> Source:  BikeRide/BikeRideHI.m3u8

–> Compare: BikeRide/hls_1m.m3u8

–> Detail:  Target duration: 10 vs Target duration: 13

–> Source:  BikeRide/BikeRideHI.m3u8

–> Compare: BikeRide/hls_400k.m3u8

–> Detail:  Target duration: 10 vs Target duration: 13

–> Source:  BikeRide/BikeRideHI.m3u8

–> Compare: BikeRide/hls_2m.m3u8

This is basically saying that the “HI” version of the playlist (which was created using Apple’s mediafilesegmenter cmd-line tool) has a ten-second target duration, but the AWS Elastic Transcoder-created playlists (the three that start with “hls”) have 13…even though the job that created them was set for 10 seconds. I am still trying to figure this one out, so any pointers would be appreciated.
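Until I figure it out, it is at least easy to spot the mismatch before running the full validator. A small sketch that pulls the EXT-X-TARGETDURATION tag out of a set of playlists and groups them; if more than one group comes back, the variants disagree, which is exactly the situation the validator flags:

```python
import re

def target_duration(playlist_text):
    """Extract the EXT-X-TARGETDURATION value from an m3u8 playlist."""
    m = re.search(r'#EXT-X-TARGETDURATION:(\d+)', playlist_text)
    return int(m.group(1)) if m else None

def check_durations(playlists):
    """Given {name: playlist_text}, return playlist names grouped by duration."""
    groups = {}
    for name, text in playlists.items():
        groups.setdefault(target_duration(text), []).append(name)
    return groups
```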

File permissions – when hosting the playlist and time segment files in an AWS S3 bucket, the permissions for uploaded files always need to be reset to readable (either “Public” or set up correctly for a secure file stream). This seems obvious, but working through the issues that the validator brought up had me uploading files multiple times, and this always came back to bite me as an error in the validator.

HLS v3 vs. v4 – except for the fact that you have to have separate audio and video streams in v4, I’m still clueless as to when and why you would use one version over the other. It would seem that a single audio stream would be needed for really, really low bandwidth. But separating out the video and audio streams is quite a bit of extra work (I would be thrilled if someone would leave a comment about a simple tool to do this). I can see some advantage in separate streams, in that it would allow the client to choose a better video stream with lower quality audio based on its own configuration. More to learn here for sure.

Client usage unknowns – now that the videos work, how do I know which variant is being played? It would be good to know if all four variants were being utilized, and under what circumstances they are being consumed (particular devices? bandwidths?). There is some tracking on AWS which I can potentially use to determine this.

I hope this helps anyone else working their way through using Apple’s HTTP Live Streaming. Any and all comments appreciated. Thanks to “tidbits” from Apple and the Apple forums for his assistance as I work my way through this.

To see the app that this is used in, click on the App Store logo. I’d appreciate feedback, especially from those not in the US, as to (a) how long it takes the videos to start and (b) how long it takes the quality to ramp up.


TestFlightApp.com goes away February 26, 2015 – use new iOS 8 Test Flight App

The website we have utilized for beta testing of apps, TestFlightApp.com, shuts down on February 26th, 2015. All app testing will be moved to the iOS 8 TestFlight app and managed through Apple’s iTunes Connect.

For both developers and users, there are many more PROs to this than CONs. The new TestFlight app for iOS 8 radically simplifies the process of beta testing apps.

With the old TestFlightApp.com, a developer had to:

  • invite users to TestFlight
  • get the user to register a device on TestFlight
  • take the UUID of the device and register it on developer.apple.com
  • put the device ID into a provisioning profile
  • use that profile to build
  • re-upload the build (or the new provisioning profile if that was all that changed) to TestFlightApp.com
  • distribute the build to designated TestFlightApp users

There were several places where that process could get stuck and could indeed go wrong.

With the new TestFlight iOS8 app, the steps are much simpler:

  • developer submits the app (there are several steps involved here, but they are basically the same as submitting an app to the app store: after Archiving, you “Submit” the app to a version on iTunes Connect that is marked “Prepare for Submission”)
  • check the box for “TestFlight Beta Testing”
  • select “External Testers” (this will not be visible until the app goes through Beta App Review)
  • invite the test user via email (does not even have to be their iTunes email, as Apple will do the mapping)
  • the users will be asked to download the TestFlight app (if they have not already done so)
  • the app will then be available for install from the TestFlight app
  • apps installed via the TestFlight app will have orange dots beside them
  • users will be notified when new versions are available

The device ID mapping is done by Apple. No changes in the provisioning profile are needed.

What else is different? Here are the cons:

  • No Android. Obviously, since Apple acquired them, support for this has been dropped. It was an advantage to have all testers and testing in one web site.
  • Beta builds only available for 30 days. After that time, the developer must submit another build.
  • No TestFlight SDK. The developer could include TestFlight’s SDK and get more data on what parts of the app the users were testing. This feature has not yet been moved over to the new TestFlight app (if anyone has found it, please let us know).
  • Wait for Beta App Review. Apps do have to be submitted for “Beta App Review”. The first time this is done, it can take a few hours to a couple of days. After that, it is quite quick, provided the developer answers a question concerning the level of changes in the build (fewer changes apparently do not require an extensive review).
  • Issues with Gmail invites. We’ve run across one issue with invites received in Gmail, where the user was not able to acknowledge the TestFlight invitation and thus could not run the app under the TestFlight app.
  • Works only with iOS 8. This is not as big an issue as it was. We assume Apple waited until the adoption rate was high enough before discontinuing the TestFlightApp.com web service.
  • If this is an upgrade to an existing app store app of the same name, the TestFlight app will write over it. The user will be notified of this with an alert notification. The user can always get the production version back via the app store.

Overall, the PROs far outweigh the CONs, and hopefully some of the other pieces will show up in the future.

Existing users can be exported from TestFlightApp.com into CSV files for import as external users on Apple’s iTunesConnect web site (where user management is now controlled). Detailed instructions here.


What time does an Apple provisioning profile expire?

For those of you that do not get the challenges of living in the Apple developer world, a bit of background: to deploy an iOS app outside of the Apple App Store, either as a “beta” with an Ad-Hoc distribution profile or to an enterprise with an Apple Enterprise Developer account, an Apple provisioning profile is required. This profile is built on Apple’s developer web site and requires a developer certificate (“trust” the developer!), a list of devices (up to 100) or the domain of the enterprise (depending on whether this is for Ad-Hoc or Enterprise distribution), and an app ID. This information is used to generate the provisioning profile, which is distributed along with the app to identify which devices are allowed to utilize the app.

For reasons known only to Apple, provisioning profiles, even Enterprise provisioning profiles, expire once a year. Perhaps this is Apple’s way of ensuring that enterprises keep up with the $299 annual fee for their Enterprise Developer License.

Recently, Apple extended the validity of certificates generated under the Enterprise Developer License to three years. But the profiles themselves still expire after one year.

This has spawned a huge marketplace for MDM (Mobile Device Management) solutions, used to help deploy (or redeploy, in the case of an expired profile) apps in an Enterprise.

It is easy to see what day a profile expires (it is visible on the device under Settings/General/Profiles, it generates a pop-up warning on the device, and it is visible on your Apple Developer account page). But, because of a last-minute customer call, we needed to know when it would really expire. This customer did not have an MDM solution, and, though we had built forced-upgrade functionality into our app, once the profile expires, the app stops working.

This is obviously a major issue with Enterprises deploying Apple apps internally. When given enough warning, it can be handled, even without an MDM.

But, given less than 24 hours notice, what we really needed to know was not only the date, but the time, that the profile would expire.

When you build a profile, you need to get it into Xcode (the Apple IDE) to use it. This can be done from Xcode, or you can download the profile as a file, then double-click it to open it in Xcode.

In other words, the profile (shown to the right) is just a file. It is in the format of a “plist file”, a properties list file.

Since we were trying to determine what time the profile expired, and we could not find that information anywhere, we decided to look inside the file. We opened it with a simple text editor (right-click, select “Open With”, and select your favorite editor).

Most of it was quite easy to read, as plist files are XML. You can tell it is a plist file as it starts with this information:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>

The file contains the developer certificate, a long character string that looks like garbage. But after that data string, there is more information, including the following nugget we had been looking for:

<key>ExpirationDate</key>
<date>2014-06-11T21:50:35Z</date>

Not only the date, but the time (in GMT, or Zulu time).
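For a scriptable version of the same hunt, here is a small sketch that pulls the date out with a regular expression targeting the key/date pair shown above. (On a Mac, `security cms -D -i profile.mobileprovision` will also decode the signed file into plain plist XML, which makes the text even cleaner to search.)

```python
import re
from datetime import datetime, timezone

def profile_expiration(raw: str) -> datetime:
    """Extract the ExpirationDate from the plist XML embedded in a
    .mobileprovision file read as text; the binary certificate data
    surrounding the XML can be ignored."""
    match = re.search(r"<key>ExpirationDate</key>\s*<date>([^<]+)</date>", raw)
    if match is None:
        raise ValueError("no ExpirationDate entry found")
    # Apple writes the date in ISO 8601 UTC ("Zulu") form
    return datetime.strptime(match.group(1),
                             "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)

snippet = "<key>ExpirationDate</key>\n<date>2014-06-11T21:50:35Z</date>"
print(profile_expiration(snippet))  # 2014-06-11 21:50:35+00:00
```

In practice you would read the .mobileprovision file with `open(path, errors="ignore").read()` and pass the result in.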

Hope this helps anyone else who needs to find the same information. Obviously, the best practice is to avoid waiting until the last minute. But if you do, it is good to know how much time you have.

Untappd on AppAnnie

The Psychology of UnTappd

If you haven’t heard of or used the app known as Untappd, you may not be a beer drinker. But the rise in Craft Beer brewing and drinking in America (see a good infographic here) has pushed many Americans out of their lager drinking malaise and into enjoying the multitude of tastes that are presented by the craft beer industry.

If they treated introductions like an MLM scheme, I’d own part of UnTappd by now :). And that is the beauty of UnTappd’s mixture of social media, location, goals/badges, and history/statistics: it is an app that you want to share, and after sharing, you encourage your friends to use. That psychology is what all social media apps should strive for.

The app is simple: you have a beer, you log that beer in UnTappd. If you desire, you can include a ranking (one to five bottle caps), a location (from Foursquare’s massive location database), a picture (which, like all social media photo sharing can come back to haunt you) and a comment.

This app is free, and that, plus making it available on as many platforms as possible, is a genius move by the developers.  By simply giving users the means to track their beers, they are building a huge data warehouse of likes, dislikes and drinking characteristics (when, where, what type, with whom) that any brewery or pub/bar would be mad not to take advantage of. Breweries can register to manage their brand on the site here.

This is the opposite of the approach taken by the app leading in profits on the iOS App Store in May 2013, Candy Crush Saga. Candy Crush is also free, but makes its revenue from in-app purchases that help users get through levels they are stuck on. (Note that one does not have to purchase anything to get through the levels; a user just needs patience. The fact that Candy Crush Saga leads in revenue is a clear indicator that US app users want instant gratification and are willing to pay for it.)

The appeal and staying power of the app is revealed in this chart from App Annie: since its release in October 2011, the app has stayed in the top 250 for Free iOS App Downloads and usually in the top 100. (chart after the break) (more…)


Grand Canyon app in Apple App Store

As a lover of books and technology, I’ve spent a lot of time the past few years investigating how to combine them. The proliferation of tablet computing, and the need/availability for interaction, have pushed us to a point where a book can be more. Terms like enhanced eBook, interactive eBook and others have been bandied about; but whatever the term, adding multimedia to a print book turns it into something more.
We also recently have been working with non-profit organizations, such as my friends at the Texas State Historical Association, helping them take their unique and valuable content (most of it in print format, or even out of print) and get it into a digitized, interactive medium…into a format that will continue to promote their goals of education, research, preservation and membership.

I stumbled across the work of some fine people utilizing HTML5 to build enhanced eBooks (the Baker Framework, and the Laker Compendium). The current ePub standard has no standard support for adding multimedia; Amazon’s Kindle format provides some, but only on certain platforms.

With these converging trends, technologies and paths, I’ve put together an app for the Apple App Store that is an experiment of sorts; a proof point, if you will, that not only can you build an entertaining enhanced eBook, but that you can use available content as a bridge to sustainable funding for non-profits.

That available content came about when my brother took me on a journey through the Grand Canyon with some great guys. A once-in-a-lifetime trip: hiking, rafting, and experiencing one of the natural wonders of the world.

With that introduction, I am happy to announce:

Cecil does the GRAND CANYON
For iPad and iPhone/iPod Touch

If you want to reflect back on a trip you made to the Grand Canyon, one of the eight natural wonders of the world, or you just want to imagine one, this app will take you there.

With proceeds benefiting the Grand Canyon Association, this Grand Canyon app follows the author, friends and guides as they:

  • hike down Bright Angel Trail;
  • raft one hundred miles down the Colorado River;
  • hike the so-called “Death March” to Thunder River and Surprise Valley;
  • visit Havasu;
  • brave Lava Falls (and live to tell about it);
  • helicopter out from Whitemore Wash.

Containing high-definition videos, hundreds of photos, maps, and the story of the journey, this multimedia application is sure to remind you of your own trip to the Grand Canyon…or increase your desire to visit.


Amazon MP3s on Twitter

While we can argue whether Twitter is a “new social media” or not, one great thing about it is the Amazon MP3 twitter feed.

Amazon certainly seems to be trying to put a dent in iTunes’ dominance. Not only are they offering DRM-free MP3s, but their Amazon downloader puts the songs right into your iTunes library (if you so choose).

You do not have to be a Twit or even a Twitter user to see the daily deals. Just go to http://twitter.com/amazonmp3 each day.

The daily deals are extraordinary. This month alone I have downloaded:

  • on 3/3 (the day the album came out), U2’s No Line on the Horizon for $3.99;
  • yesterday, John Coltrane’s The Ultimate Blue Train for $1.99;
  • today, Diana Ross & the Supremes’ 18-song Definitive Collection (everyday price: $7.99) for $1.99.

Yes, an eclectic mix, but they feature different albums every day. I’m going to ignore the feed for a few days before I personally re-stimulate the economy.

David Byrne and Brian Eno together again

The Talking Heads’ live album “Stop Making Sense” was the staple at our rugby games, the tunes that got us jazzed up to play. The movie of “Stop Making Sense” (cue the visual of Byrne dancing in his oversized suit) really pounded that album into my brain.

Byrne has re-united with Brian Eno, who produced several Talking Heads albums, for a new album called Everything That Happens Will Happen Today. The streamer for it is embedded below. Good to hear you again, Mr. Byrne, and please bring back the suit.


tunerev: The Red Piano by Elton John

Last year, it was a trip with our friends Monte and Margaret (from Munich) to Oxford, England to see Dave Matthews and Tim Reynolds…and a fantastic trip it was.

This year, it was Vegas and Elton John at the Colosseum at Caesars Palace. A shorter trip for my wife and me, but a longer trip for our traveling companions…but, again, well worth the air miles. This wasn’t just a concert, but a multimedia event. Elton and his band played in front of a huge screen, which showed a video or photographic collage for each piece played. David LaChapelle, a somewhat surrealistic photographer and film director, designed the set and the movies. (more…)

Re-reading MSandT

Re-reading Tad Williams’ Memory, Sorrow and Thorn


Dusk Before the Dawn

Software by the Kilo
