// archives

Technology

This category contains 22 posts

Viewing the International Space Station

We saw the International Space Station (ISS) as we drove from Houston to Austin Monday evening (10/16/17) at about 7:50pm, looking at the NW horizon. At first we mistook lots of planes and stars for the ISS, but once it came shooting across the sky it was easy to identify.

We heard about it on the news, but found the directions at SpotTheStation, a NASA website that will not only tell you when the ISS is coming into view in your area and alert you via email, but, if you scroll down the page, will also tell you what the directions for viewing mean (see image to the right).

Watching the ISS go over reminded me of another website I had found: WhoIsInSpaceRightNow. As the ISS was going overhead, the number was 6…as always, an amazingly small number compared to the 7.4 billion people on Earth.

And the total number of people who have ever been in space currently stands at 536 according to this Wiki article.

Both current and all time space travelers remain an exclusive club…until the bus starts loading for Mars!


Steps to Rate and Review Apps on the Apple App Store

An App’s visibility on the Apple App Store is enhanced by its ratings and reviews. We’ve received many great ratings and reviews for our JoSara MeDia apps that are in the Apple App Store. And the steps to provide App Store reviews and ratings have changed slightly between the current iOS10 and the forthcoming iOS11 (now in preview).

What some users are not aware of is that the rating visibility of an app is reset with each new version. Even a minor change to an app will cause the ratings to reset. For example, we made a minor change to our Grand Canyon app to make the videos fit better when the iPhone 7 and 7+ were released. Even though the Grand Canyon app has only five-star ratings (13 at last count), those ratings are not visible on the new version unless a user explicitly selects to look for reviews and ratings for “All Versions.”

There are several ways to encourage re-rating, such as in-app pop-ups (Update: this will change with iOS11, as Apple will require developers to use its own in-app rating API, which will limit the number of times a user can be prompted). But users can also go into the App Store from their iPhone or iPad and easily rate apps.
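For developers, the in-app rating API referenced above is StoreKit’s SKStoreReviewController (available since iOS 10.3); a minimal sketch of prompting from within an app:

    import StoreKit

    // iOS decides whether the prompt actually appears (it is rate-limited),
    // so treat this as a request, not a guarantee.
    if #available(iOS 10.3, *) {
        SKStoreReviewController.requestReview()
    }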

The main differences between App Store reviews and ratings in iOS10 and the iOS11 preview are:

  • the app developer will be given a method (most likely in iTunes Connect) to select whether they would like ratings to carry across updates (hat tip to Reddit user emlombardo for pointing this out)
  • on iOS11, rating an app can be done from the app’s page in the App Store without going into the “Write a Review” section. This should provide for more ratings, albeit without reviews.

Here are the steps. Screenshots are below.

  1. Tap on the App Store Icon
  2. Tap in Search
  3. Type in “josara” to search for JoSara MeDia apps (or any other app name or app developer if you want to rate their apps)
  4. Scroll to the app you want to rate and tap on that app’s name
  5. For iOS 10 and earlier versions:
    1. tap on the “Reviews” tab (between “Details” and “Related”)
    2. tap on “Write a Review”
    3. tap on a star rating (5 stars is best, 1 star is worst)
    4. if you like, enter a title and a review of the app
  6. For iOS 11 (as of the current beta):
    1. scroll down to the “Ratings and Reviews” section
    2. tap on a star rating
    3. if you like, tap on “Write a Review”

The screenshots in the original table walked through the same steps for iOS10 and earlier versus iOS11:

  • Locate the App Store
  • Locate the Search icon
  • Enter search terms
  • View search results
  • Select the app to rate and/or review
  • For iOS10, tap on the “Reviews” header; for iOS11, scroll down to the “Reviews” section
  • For iOS10, tap on the “All Versions” tab to see ratings and reviews for all versions; for iOS11, as of the current preview release, I can find no such option
  • To rate an app: for iOS10, tap on “Write a Review” (star ratings are inside the “Write a Review” section; tap a star, then tap Send; reviews are optional); for iOS11, simply tap on the star rating
  • To review an app, tap on the “Write a Review” label in both versions


RWDevCon 2017 – Notes and Thoughts

I attended the third RWDevCon, held March 30 – April 1 in Alexandria, VA. (Yes, I know that was 10 days ago, but I had a lot of notes.) Since this was the third time RWDevCon was held, I am one of a reported 12 faithful attendees who have attended all three conferences. I even dragged a colleague from one of my customers along this year.

Why do I return for the joy of back-to-back eight-hour days of seminars (this year with a third all-day workshop for added pleasure)?

  • Hands-on Tutorials. Before each conference, Ray Wenderlich (the “RW” in the DevCon) and his team send out the slides that will be covered in the tutorials (a small PDF) along with a Zip that includes PDFs of slides and Xcode projects for all of the demos and labs (with multiple start and stop points) that will be covered as well…and this Zip is huge. This size comparison drives home the point: this is the most hands-on conference of any developer conference I’ve been to. I was prepared this time, had all of these files loaded in Notability on my iPad and was taking notes on the PDFs (aside: I was a skeptic about the Apple Pencil, but this note-taking setup has converted me). In the previous two years, there were some pacing issues, where there was just way too much to type in the demos, and that got in the way of listening to the instructor. When the instructors type while they talk, and expect you to listen while you type, laptops get closed and people start to just listen…which is okay, but not optimal. This year, I only experienced that one time. Ray and his team, with all of the practice they put into these sessions, have it down.
  • Conference organizers who listen. I (and every other attendee) received an email from Ray asking what types of sessions we’d like to see in the conference. I do not recall all of my responses, but I did list “error handling”, “unit testing”, “machine learning” and “application architecture”. All of these appeared on the agenda. Even a small item, like a request to have something other than those nasty, incredibly-bad-for-you Coke products during the breaks, was listened to…with Naked Juice now as an option. When they listen to their attendees, good things happen.
  • Ray and Vicki. As I was walking into the workshop the first day, I ran into Ray, looking very bleary-eyed. Though I knew he’d probably been up all night reviewing sessions and entertaining the vast horde of Brits he seems to employ (!), he stopped for a quick chat and update. Vicki walks through with her ever-present smile, knowing everyone’s name. Though I only see these two once a year, they are very enjoyable people and I revel in their success. They make no excuses about being introverts, but practice their presentations over and over until they seem comfortable in the spotlight. I hope RWDevCon grows to dominate the world, just so they can enjoy the fruits of their labor.
  • The RWDevCon team. They work hard, they are there to help you learn, and they are a lot of fun. And Brian lets me win at the card games (sometimes). Ray and Vicki, at every conference so far, talk about inclusion, about letting people feel not only comfortable but like they belong. They walk the walk.
  • Beer. Ray and Vicki call this friendship, camaraderie…but let’s face it – we’re here for the beer (and the card games. and the people). I must ding Mr. Wenderlich, though, for two beer notes:
    • At the bash at the Carlyle Club, Ray was telling me how wonderful the beer was that he was drinking; when he left to go do host-type things after setting his bottle on the bar, I asked the bartender for one…only to find out that Ray had just quaffed the last one (but left an inch in the bottle…I did NOT drink after Ray). I do not remember what beer this was as I have blotted it from my mind.
    • For the receptions at the hotel, the catered food and mojitos were quite good. But when Sam Adams is the best beer of the selection…time to head down to the Trademark for their excellent selection of beers….or to the post-conference bottle share…but that is a later session.

It would be great to see RWDevCon grow into something much larger, so that those who put in the hard work could realize that success. But it is also excellent at the size it currently is. Ray, Vicki and the team have a hard balancing act to do.

Bottom line: I cannot recommend this conference highly enough to any and all iOS developers. Most of these sessions provided me with enough information that I could use what I learned immediately. Some are difficult enough that I’ll need to review the material before knowing enough to be dangerous. But the tools and education gained in these three days provide a high ROI on time and money spent.

All day workshop – App Architecture (Josh Berlin and René Cacheaux)

It was a tough choice between the Debug workshop and this one, but need won out. I’d recently completed a cycle count and inventory Swift application on a very tight timeline, and I KNEW I had broken many app architecture rules in haste…so I went to work to re-learn and hopefully be amazed by some new ideas as well. Had I known attendees would be getting a copy of Advanced Apple Debugging and Reverse Engineering at the conference close, written by Derek (who was leading the other workshop), I would have had more incentive to choose that session…until Josh and René put out an app architecture book.

After meeting one of my three favorite nephews for brews and dinner at the Trademark downstairs the night before, I was primed and ready (obligatory local beer picture included) for the next morning’s eight-hour session.

What I learned:

  • Josh and René are both from Austin, where I’ll be living soon. Beers will be shared.
  • From the intro: “Thoughtful design of the boundaries between your app’s subsystems is the foundation of a stable codebase.” Deep stuff for 9am!
  • Dependency Injection (the demos used the Swinject library; there are others available). The first demo pulled an API call out of the view controller and put it into a Swinject Assembly…we need to use this as we refactor our cycle count app (a minimal sketch follows this list).
  • Storyboard Dependency Injection.
  • Use case driven development. This is not a new subject; a quick google of that phrase shows many old articles on university web sites, IBM, ACM, etc. But the workshop (specifically Demo 2) showed a Swift version of implementing use case objects.
  • Unidirectional data flow with a state store.
  • Redux – state stored outside of the view controller
  • Some RxSwift as an appetizer to the session tomorrow (this actually had me change my schedule to attend the RxSwift session instead of the Fastlane session).
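A minimal sketch of the dependency-injection pattern from the first demo – the protocol and class names here are hypothetical stand-ins for our inventory app, not the workshop’s code:

    import Swinject

    protocol InventoryAPI { func fetchCounts() }
    final class RESTInventoryAPI: InventoryAPI {
        func fetchCounts() { /* network call lives here, not in the view controller */ }
    }

    // The container owns the wiring...
    let container = Container()
    container.register(InventoryAPI.self) { _ in RESTInventoryAPI() }

    // ...and the view controller resolves its dependency instead of constructing it.
    let api = container.resolve(InventoryAPI.self)!
    api.fetchCounts()

Swapping RESTInventoryAPI for a stub in tests is then a one-line registration change.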

After the workshop, with a full brain, sore butt from sitting, an hour to spare until the opening reception and the threat of bad running weather in the ensuing days, I headed out to run down King Street, a very cool old set of blocks that runs towards the river, right into more running trails (obligatory running scenery picture included – session details after the photo).

AlexandriaRun

Friday sessions

There were three tracks of sessions. These are the ones I selected to attend. There are several others where I either worked through the demo material or plan on doing so…as, so far, one cannot be in two places at once.

Machine Learning on iOS (Alexis Gallagher)

Why I attended: I want to employ machine learning with two of our products (Secure Workflow and Clinical Decision Support).

What I learned:

  • Docker can actually be run on my wimpy old MBA, making me glad I didn’t upgrade (but of course, when the chipset gets refreshed, I’ll be the first in line).
  • The setup was well done, starting the training part of the machine learning demo first (so it would actually complete on a wimpy old MBA) and then going through the theory.
  • The Docker image had Google TensorFlow, and we used the Inception-v3 algorithm (a hedged docker sketch follows this list).
  • Alexis has some great smiles…and some rather frightening frowns.
  • We made training data by pulling images out of videos, classifying those videos as smiling and frowning, and then letting the Docker image train on the extracted images against the baseline.
  • The TensorFlow console is accessible in the Docker container, and allows you to browse deeply through the Inception-v3 network.
  • I spoke with Alexis after the session, and discussed with him how I wanted to employ machine learning. Based on my description, he suggested linear regression and pointed me towards the Machine Learning course on Coursera taught by Andrew Ng of Stanford University (which he also mentioned at the top of his “Where to go from here?” slide at the end of the session). A session of it just started, and it was a great suggestion (thanks, Alexis).
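The workshop used its own prepared image; as a hedged sketch, the stock TensorFlow image can be run the same general way (the mounted path is hypothetical):

    # Mount a folder of training images into the container and get a shell.
    docker run -it -v $(pwd)/training-images:/images tensorflow/tensorflow bash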

iOS Concurrency (Audrey Tam)

Why I attended: We have an old Objective-C app that is getting converted to Swift, and it needs some concurrency help at one customer site, as their network is slower. I’d like to put parts of the data refresh in the background while updating the animations of the workflow tasks and notes. Plus Audrey is my wife’s name, so there you go.

What I learned:

  • I have a lot of work to do.
  • What not to do (deadlocks, race conditions, priority inversions)…the same concurrency problems that exist in all programming languages
  • Dispatch groups, dispatch barrier tasks (a minimal dispatch-group sketch follows this list)
  • Async operations
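A minimal sketch of the dispatch-group idea applied to the data-refresh case I mention above (the refresh bodies are hypothetical placeholders):

    import Foundation

    let group = DispatchGroup()
    let queue = DispatchQueue.global(qos: .userInitiated)

    group.enter()
    queue.async {
        // refresh workflow tasks in the background
        group.leave()
    }

    group.enter()
    queue.async {
        // refresh notes in the background
        group.leave()
    }

    // Runs on the main queue only after both refreshes call leave(),
    // so UI/animation updates stay on the main thread.
    group.notify(queue: .main) {
        // update the workflow animations here
    }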

Building Reusable Frameworks (Eric Cerney)

Why I attended: The old Objective-C app I mentioned earlier has frameworks, and they need to be updated.

What I learned:

  • “A library is a tool, but a framework is a way of life.” Let’s do t-shirts!
  • Explanation of some of the differences between the three main dependency managers (Swift Package Manager, Carthage, CocoaPods)
  • SPM still doesn’t support iOS
  • using ‘git tag’ to manage versions (a minimal example follows this list)
  • Demo 1 walked through Swift Package Manager; Demo 2 walked through building a multi-platform library (which, as I’ve heard, is a tool, not a framework…)
  • great demo on access control, and a good list of “gotchas” on bundle specifications, global values, and singletons
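For the ‘git tag’ point, the gist is that dependency managers like Carthage (and CocoaPods podspecs) resolve framework versions from semantic-version tags; a minimal example:

    git tag 1.0.2
    git push origin 1.0.2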

RxSwift in Practice (Marin Todorov)

Why I attended: Reactive programming. Buzz words. Got a taste during the app architecture workshop. And I sensed a book was coming (obligatory signed book page picture included; only found Marin and Ash though…there are a lot of authors!)

What I learned:

  • Don’t try to use reactive programming for everything (also mentioned in one of the inspiration talks)
  • Asynchronous code and observables. Demo 1 walked through “Variable” and emitting and subscribing to events
  • Using Observables with “bindTo” to tie incoming JSON directly to a tableView. This I will definitely use as we update our workflow app (a minimal sketch follows this list).
  • Using “bag = DisposeBag()” to get rid of subscriptions…takin’ out the trash!
  • I’ll walk through the book for more detail. We should be able to use this in the tableViews in our cycle count and inventory app, which show which aisles still have items to be counted. That data is made available via a RESTful web service, and currently gets updated in a non-reactive way when the current user assigns themselves an aisle to count.
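A minimal sketch of the bindTo idea, assuming an existing tableView with a registered “Cell” identifier (note: later RxSwift versions renamed bindTo(_:) and addDisposableTo(_:) to bind(to:) and disposed(by:)):

    import RxSwift
    import RxCocoa

    let bag = DisposeBag()
    // Stand-in for aisle data decoded from the RESTful JSON response.
    let aisles = Observable.just(["Aisle A1", "Aisle B2", "Aisle C3"])

    // Each emission reloads the table's rows; no manual reloadData() calls.
    aisles
        .bindTo(tableView.rx.items(cellIdentifier: "Cell")) { _, aisle, cell in
            cell.textLabel?.text = aisle
        }
        .addDisposableTo(bag)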

Saturday sessions

Practical Unit Testing I (Jack Wu)

Why I attended: Code is never tested enough. Especially mine.

What I learned:

  • When Jack says “practical”, Jack means practical. Lots of good points about balancing the time it takes to write and maintain tests versus having “good” and “useful” tests. Another way to read this: “Jack hates testing code as much as I do, so let’s do it efficiently and quickly and no one will get hurt.”
  • Write tests before you refactor, make sure the tests succeed, then refactor.
  • How to write a basic unit test (a minimal sketch follows this list)
  • How to write a UI Test in Xcode, and how to make them not so darn slow
  • You can refactor your code to be more testable, and this makes for easier-to-understand code. My lead developer is a refactoring machine, and his code is always testable…makes sense.
  • I started in the session writing tests for the next version of the cycle count and inventory app. This was a very practical and applicable session.
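A minimal sketch of a basic unit test, using hypothetical names from our cycle count app rather than the session’s code:

    import XCTest
    @testable import CycleCount  // hypothetical app module

    class AisleCountTests: XCTestCase {
        func testMarkingAnAisleCountedRemovesItFromRemaining() {
            var counts = AisleCounts(remaining: ["A1", "A2"])  // hypothetical model type
            counts.markCounted("A1")
            XCTAssertEqual(counts.remaining, ["A2"])
        }
    }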

Swift Playgrounds in Depth (Jawwad Ahmad)

Why I attended: I didn’t get to use playgrounds enough as a kid. It was a tough choice between this one and Practical Unit Testing II.

What I learned:

  • Playgrounds are still flaky. Several folks had to restart Xcode (myself included) to get the Live View to work.
  • Indefinite execution in a playground…cool (a minimal sketch follows this list)
  • Reading from and writing to a file in a playground…quite useful
  • moving code into a framework to use in a playground
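A minimal sketch of the indefinite-execution piece:

    import Foundation
    import PlaygroundSupport

    // Keep the playground alive after top-level code finishes so async work can run.
    PlaygroundPage.current.needsIndefiniteExecution = true

    DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
        print("fired two seconds later")
        PlaygroundPage.current.finishExecution()
    }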

Advanced iOS Design Patterns (Joshua Greene)

Why I attended: The description talks about authentication, auto re-login, data and thread safety designs.

What I learned:

  • This was one session where I could not keep up with all of the typing, and for the most part sat back and listened.
  • Demo 1 walked through MulticastClosureDelegate; Demo 2 walked through the Visitor pattern.
  • During the lab time, I actually started going through the “On-boarding” seminar slides and demo/labs. I’ll be running back through both of these sessions again. The on-boarding piece is quite useful for first time training users, even (or especially) for Enterprise apps.

Swift Error Handling (Mike Katz)

Why I attended: My error handling looks like the if-then-else statement from hell.

What I learned:

  • I’m going to “borrow” all of Mike’s error handling routines.
  • throws, try (not the rugby kind of try I’m used to!), do/catch
  • passing errors up (from inside the function to the caller)
  • RETHROWS!
  • Using a Result type and map
  • The Lab went through ways to provide error responses to Alamofire and also a way to do auto-retries after timeout errors (a hedged throws/rethrows sketch follows this list)
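A minimal sketch of the throws/rethrows pieces – the retry helper and names are my own hypothetical stand-ins, not Mike’s routines:

    enum CountError: Error { case missingAisle }

    // Errors propagate from inside the function up to the caller via 'throws'.
    func fetchCount(aisle: String?) throws -> Int {
        guard aisle != nil else { throw CountError.missingAisle }
        return 42  // stand-in for a real lookup
    }

    // 'rethrows': this only throws if the closure it was handed throws.
    func retrying<T>(_ attempts: Int, _ body: () throws -> T) rethrows -> T {
        for _ in 1..<attempts {
            if let value = try? body() { return value }
        }
        return try body()
    }

    do {
        let count = try retrying(3) { try fetchCount(aisle: "A1") }
        print(count)
    } catch {
        print("failed: \(error)")
    }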

Closing session

  • The RW team pushes the session and conference evaluations hard, because they compile them on the fly. And, in this closing session immediately after the last inspiration talk, Ray details a summary post-mortem and asks for more feedback. This is the only conference that I can recall that does this.
  • One way they push the evaluations is they give out prizes (you get an entry ticket for each evaluation). And, for the third year in a row, I won…nothing.
  • But I did get two books, both of which have been previously mentioned (RxSwift and Advanced Apple Debugging). And managed to get both of them signed (obligatory signed book pictures)

Post-conference bottle share

Why I attended: I brought two great beers from local Houston breweries and they needed to be tasted. (obligatory Houston beer picture included, especially since several people asked about them. They were mighty tasty)

What I learned:

One could make a case that this wasn’t really part of the conference. But it was. We traded beer stories, travel stories, family stories, tried to kill a monster in a dungeon while bluffing (more card games), and generally had a great time. All were invited.

Conclusion

With the amount of pre-conference setup, conference materials and notes gathered from this RWDevCon (and the previous two), the investment here will continue to pay off as they are used and referenced. Next comes incorporating these into release plans for the apps we already have deployed, and those we will deploy in the future.

Some additional photos included here at the end.

(photos: Ray’s closing, Marin’s keynote, Turtonator, the bottle share)


SXSW2017 Health Technology sessions

Notes on the 2017 SXSW Health Tech sessions I attended (some with photos, some with photos of slides from the presenters), in order of relevance to current projects. The sessions (and links to each if you want to jump down) are:

Screening for Heart Disease with the Apple Watch

No More Apps – Why Reinventing Devices is Key

Diabetes Avalanche

Dunking on Disparity: Health Tech for All

Health Data: WTF? We’ve been to the Moon, But…!

To Build in Health, Follow the $ Not the Patient

To see notes from other SXSW2017 sessions:

Bruce Sterling

Equity Crowdfunding

Screening for Heart Disease with the Apple Watch

Presenters: Dr. Ray Duncan, Dr. Joshua Pevnick (both from Cedars-Sinai Health System)

This was an excellently balanced and informative presentation where Dr. Duncan presented the technical perspective and Dr. Pevnick presented the data analytics and research perspective. I took pictures of most of their slides, the pertinent ones are included here.

Cedars-Sinai started allowing patients the option (voluntary) through their patient portal to link wearable devices and their readings, and integrated those readings into their EPIC EMR system. With little advertising they got up to 2,800 patients (out of 130,000 portal users) sending in readings.


  • EPIC integration for wearables (for some) comes out of the box
  • Early data was from younger, healthier patients (who are the target early adopters of this technology)
  • Due to the amount of data, visualization is key (and, I would assume, some machine learning for pattern matching would be great with this data)
  • Some data is erratic – is it device error or normal variant or pathology?
  • Dr. Duncan and Dr. Pevnick’s slides and presentation were both excellent. I’ve inserted a few of them below.

(slide photos from the presentation)

No More Apps – Why Reinventing Devices is Key

An interesting session title, especially given that two of the panelists with devices also had apps that were critical to their devices. The incongruity was somewhat rectified by the discussion that the focus was on the device, as opposed to YAAS (Yet Another App Syndrome, my acronym).

Panel: Lu Zhang (NewGen Capital, VC), Stuart Blitz (SeventhSense Biosystems), Janica Alvarez (Naya), Jeff Dachis (OneDrop)

  • SeventhSense has TAP, a one-touch blood collection device (for use by healthcare professionals, not consumers currently). The device had just been FDA approved. Stuart was formerly with Agamatrix, a connected blood glucose meter vendor.
  • Naya Health has a connected smart breast pump.
  • OneDrop has a subscription service for their bluetooth connected meter, strips and lancet. The device is FDA approved.
  • Discussion on FDA approval, and seeing the FDA as a friend, not the enemy. Naya FDA approval took five months.
  • Why no more apps? The device plus the app is an ecosystem.

Diabetes Avalanche

I could have elected to wait in the two lines for Joe Biden (one for wrist bands, one to get in) and his cancer moonshot discussion. And, as I found out later, I also could have fanboyed out and found the Game of Thrones session (which I wasn’t aware of), which was apparently right next door to Biden.

But the statistics and perspectives presented in this SXSW Health Tech session were a reminder of the size of the problems of diabetes and pre-diabetes.

Panel: Dr. Phyllisa Deroze (Black Diabetic Info), Dr. Sarah Mummah (IDEO), Marie Schiller (Eli Lilly), Adam Brown (diaTribe.org)

  • The cancer moonshot (dollars for a cure) was upstairs with Vice President Biden. Adam Brown asked: where is the moonshot for diabetes and pre-diabetes?
  • Slide of how large the problem is, and growth rates (see below)
  • A lot of comments on poor diabetes education, and what can be done about it (both websites linked to in the panel list have lots of great education information)

SXSW2017 Health Tech

Dunking on Disparity: Health Tech for All

Panel: Dr. Baker Harrell (It’s Time Texas), Michael Mackert (UT Austin), Nish Parekh (IBM Watson), Stephen Pont (Dell Medical Children’s)

This was a Texas-focused session about using technology to reach all Texans. Statistics were presented about smartphone penetration (i.e., they’re almost everywhere), and the “Choose Healthier” app, a collaboration between It’s Time Texas and the Dell Children’s Medical Center, was introduced. At the time of the presentation it contained events and location information for in and around Austin.

The slide below shows stats from a 2016 Pew survey on smartphone penetration. The point of the panel was that apps could be delivered to all people regardless of income level or demographic factors.

DunkingOnDisparityPhoneStats

Health Data: WTF? We’ve been to the Moon, But…!

This is the session where I got stuck in an elevator on the way up to the Austin Chamber of Commerce. Lovely! Apparently the elevator is the only way to get up to the chamber. We weren’t in there longer than ten minutes, and since it was raining out it wasn’t too steamy…just another bit of excitement at SXSW.

Panel – Brian Baum, Charles Huang, Karen DeSalvo, Sukanya Soderland

This panel had an interesting mix of local and national perspectives, all of whom agreed that data collection is hard but data integration is harder. One of the best slides was one I got a mostly crappy photo of (if you get stuck in an elevator you don’t have the best choice of seats, or so I found out). But it talks about the amount of money that is invested in segments of healthcare that create or utilize data…versus integrating or sharing it. That slide is below.

DataBurden

Karen DeSalvo, the former director of the ONC, shared the goals of data and system integration between the public and private sectors. There was little discussion of what would happen to these goals under the new administration.
DataPublicPrivate

At this panel, Brian Baum introduced Connected Health Austin, a local initiative. There was discussion of defined data communities within Austin, and how they “solve the same problem differently everywhere,” followed by discussion of how Connected Health Austin would be different in this regard. I heard of several initiatives of this type in Austin during SXSW; hopefully they will all interconnect.
DataConnectedHealth1

DataConnectedHealth2

To Build in Health, Follow the $ Not the Patient

Panel – Abhas Gupta, Andrew Rosenthal, Carine Carmy and Matt Klitus

The focus here was on providing advice for starting a company in the health tech sector.

  • Discussed using Net Promoter Score, something we see more and more in healthcare for feedback
  • Try shifting money/cost from the wellness budget to the medical budget. $500 is a large sum for wellness, but not at all for medical
  • LTV/CAC – Lifetime Value over Customer Acquisition Cost; LTV is dollars per customer per year, times the number of years, times the profit margin. This ratio should be over 3x, per Gupta (a worked example follows this list)
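As a made-up illustration of the arithmetic: a customer paying $500 per year, retained for three years at a 40% profit margin, gives an LTV of $500 × 3 × 0.4 = $600; the CAC would then need to stay under $200 to clear the 3x bar.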

SXSW2017 Equity Crowdfunding

Notes from the SXSW2017 Equity Crowdfunding session.

Presenters: Slava Rubin (one of the founders of IndieGogo), Bill Clark (CEO of Microventures)

First Democracy VC is their joint venture focusing on equity crowdfunding, made possible by Reg CF, Title III. Slava and Bill said all of their ventures thus far have reached their funding goals. A slide they shared (at the end of these notes) shows that as of March 2017 about 230 Reg CF offerings had been filed with the SEC (since it ‘went live’ in May 2016).

Brief History

Slava shared a brief timeline leading up to the availability of equity crowdfunding.

  • 1933 Securities Act (created SEC, accredited investors, IPOs)
  • 2012 JOBS Act (which contains Reg CF about equity crowd funding)
  • 2016 (May): Reg CF goes live

Three types of Equity Funding

  • Reg D 506 – accredited investors only. Form an LLC; it can have only 99 investors
  • Reg A – mini IPO. Raise up to $50 million
  • Reg CF, Title III (created out of the JOBS Act): raise up to $1 million in 12 months. Raising over $100,000 requires a financial review. You have to use a portal; other rules apply.

Types of CF raises

  • Equity Campaign
  • Revenue Share: investors get a percentage of sales until the initial investment is paid back. Example: invest $100K; if the company is successful, it returns $150K, but no equity. A CF raise can also include perks (similar to a Kickstarter or IndieGoGo campaign).

A slide the gents shared on current equity crowdfunding statistics is shown below.

SXSW2017 Equity Crowdfunding Stats


iTunes Match – fixing “Waiting” and un-downloadable songs

I like the idea of Apple’s iTunes Match service, but I’ve had some issues getting it to work the way I think it should, especially with ripped copies of CDs that I own. The main issue is songs constantly showing an iCloud Status of “Waiting”, and those same songs not being downloadable to any of my iOS devices. This is what I did to fix it and get to where all of my songs are uploaded or matched on my OS X and Windows desktops and laptops, and available to download on my iOS devices. Hopefully it will help someone having a similar problem.

The basic fix is to find any song that has an iCloud Status of “Error” and fix that error, either by deleting that entry, locating the song (if iTunes could not find it) or some other remedy. Since this did fix my problem, I’m assuming the synchronization process between iCloud and the local machine does not handle or report errors very well, and either times out or just fails when it encounters them.

Why iTunes Match

There are several benefits to using iTunes Match – when it works:

  • song availability on multiple devices (Mac, Windows Desktop, iPhone, iPad, Apple TV)
  • higher quality versions of songs
  • vinyl conversion. I’ve ripped several of my old LPs (some of which I cannot find in iTunes) and would like to use iTunes Match to make high-quality versions of these songs available across all of my devices.

Issues

The main issue is when songs from albums that I own (CDs ripped, vinyl converted) are stuck in a “Waiting” state in iTunes for OS X and iTunes for Windows, and show as not downloadable (no “download from cloud” button). This state persists even when I’ve tried to force an update (from the iTunes menu: File -> Library -> Update iCloud Music Library). The “Waiting” state looks like the screenshot below.

iTunes Match

Solution

It appears that there was an error in the iTunes “Update iCloud Music Library” process or the normal process to try and match music. But there is no error log. To detect the error, you have to look at the “iCloud Status” for each song.
To do this and detect errors:

  • Turn on visibility of iCloud Status. Select Songs from your Library, then select View -> Show View Options. See the screenshot below.
  • Check the box next to “iCloud Status”

iTunes Match

  • Sort on the iCloud Status column. There should be “Matched”, “Uploaded”, “Waiting”, “Ineligible”, and, in my case, “Error” (see screenshot)

iTunes Match

  • Fix the songs whose iCloud Status is “Error”. In my case there were some that could not be found, and some that were from an old iTunes account. I removed those songs from iTunes, restarted iTunes, and hit the “Update iCloud Music Library” link. Everything that was Waiting became either “Matched” or “Uploaded” on OS X and Windows, and all songs became available for download. Note that there are some titles marked as “Ineligible”. Most of mine were digital booklets, objects that would never sync anyway. These did not need any fixing.

I never found an error log that showed the exact errors, only this indicator in iCloud Status. Since I did this, I’ve had no issues on any of my devices.


Pokemon Go for Runners, Developers and Businesses

Back in the day, my son collected Pokemon cards, played Pokemon on Gameboy, and taught me about Pikachu, Snorlaxes, and other interesting creatures…as I’m sure the kids of many others my age did. As my son grew older, he gave his Pokemon card collection to someone much younger who had more enthusiasm (a very generous move, one he semi-regretted when he saw the prices for some of those cards on eBay!) and moved on to other things. Now in his mid-twenties, my son and I are playing Pokemon Go, semi-together from 200 miles away.

Despite the articles about the “nerd herd” and getting the geeks out from behind their computers (which is a pretty good thing, IMHO), in addition to the aforementioned family camaraderie (and I loudly applaud those friends of mine who are actively playing with their kids), there are other obvious reasons certain people should become familiar with this app/game:

DEVELOPERS

Pokemon Go is the top free app (with in-app purchases) on the Apple App Store and Google Play Store in the US, the UK and multiple other countries, and has been there since its release. It is the fastest app to reach 10 million downloads worldwide, reaching that mark in seven days (source). It also currently leads all apps in daily usage time (i.e., how long users actually have the app open) (source).

It did have a bit of a head start in both content and database:

  • The game is built on top of Ingress (which is a game similar in play to Pokemon Go, but with a different story line), which is also a game put out by Niantic. From what I understand, all of the locations and landmarks in Pokemon Go originated from the database that Ingress uses.
  • The content head start is obvious – the previous cards and games provide not only the 151 pokemon in the current game, but fodder for expansion in later games…and a knowledgeable audience familiar with how the game might work.


There are some characteristics of the game that are familiar, especially to those who played previous pokemon games. But the basics are familiar to anyone who has used any count/goal-based program: collect everything and level up. This is a common development model, whether for a beer-drinking app like Untappd (see my breakdown of the Untappd app here), a healthcare/shopping app like Walgreens, or game apps. There are badges for most everything (similar to programs like Untappd), though I seem to rarely look at them, other than for counts.

There are some characteristics that are missing:

  • there isn’t any type of social sharing (unlike Untappd, where you can toast a friend’s beer check-in; you cannot high-five your friends when they get a rare pokemon).
  • a user cannot see their friends in the game. Though this would be great for multi-player play, it would certainly complicate the program, and could enable a bit of stalking (if it were done without some type of permission).

These are holes that will be filled, either in future releases or by independent developers. There are already examples of an entire ecosystem springing up around the game: chat apps (see this developer’s app blog), for example, which I assume are used to tell people when a rare pokemon is near. There are also several hacks, such as maps that use the app protocols to determine locations of pokemon, pokestops, etc. (most of these can be found in the pokemondev sub on reddit). Some of these are getting shut down; one even mentioned a “cease and desist” order.

The “augmented reality” piece, where you can use your device’s camera to see pokemon against the background of the real world, is interesting but unnecessary in this game. It is such a battery sucker that I do not know of any players who have not turned it off. It is being used primarily as a novelty (I found a pokemon at a landmark) or by businesses to lure pokemon hunters in.

ENTREPRENEURS and INVESTORS

Estimates of how much the game has made the various parties vary. One estimate says that Apple, purely on the percentage it receives from in-app purchases through the app, will make $3 BILLION in revenue over the next couple of years (source). Since Apple gets 30% of in-app purchases, $3 billion for Apple implies roughly $10 billion in total purchases, and therefore an estimate of $7 BILLION in revenue for Niantic (one would assume this gets shared with Nintendo for licensing).

There is, of course, no need to spend money in the game if you choose not to (full disclosure: I do not). Sensor Tower estimates $1.6 million per day spent in the US. And the app has not yet been launched in Japan, where the average spend per mobile user is higher and the Pokemon craze is even more rabid.

Nintendo’s stock price doubled following the release of the app (chart here) though it has retreated a bit from those highs.

Local businesses are taking advantage as well. Yelp now lets users filter based on pokestop locations. Many shopping areas and downtowns will have multiple pokestops near them. In the game, there are items known as “Lures” which do what the name implies (they lure pokemon to a pokestop for 30 minutes). When this happens, the pokestop lights up on the map, shooting purple pieces up like flares. Small businesses near pokestops are dropping these lures to lure people in while they hunt.

INFRASTRUCTURE and SYSTEM ADMINISTRATORS

Pokemon Go is almost as well-known these first few weeks for server crashes as it is for having more users than most other applications. Since Niantic spun out of Google, one would assume that they have Google infrastructure. They don’t have Amazon Web Services (AWS), as the Amazon CTO has humorously and repeatedly offered help over Twitter whenever the servers are down.

As the game added multiple countries over this past weekend (July 16), the servers supporting the game crashed repeatedly, leaving the game inoperable most of that Saturday morning.

The image on the right is all that the players see. There is no notice that the game is having server issues. So users either continue to press “retry” (which comes up after a few minutes of this screen) or kill the app and start over…both of which cause more login attempts and impact on the servers.

From a capacity-planning standpoint, one would assume that a trending analysis would be done on the initial United States implementation before adding the multiple additional countries. Either this was not done or it was done incorrectly, and the added load crashed the servers.

This is tolerated somewhat humorously (check out the Pokemon Go reddit forums for examples) for now. But if there are tours, events and other plans made around the app (as there were that Saturday), this will not be acceptable to the user community for long.

Interestingly, as of this writing, Niantic is advertising for a Software Engineer – Server Infrastructure...probably a much-needed position just now!

RUNNERS

My fellow joggers: we have an enormous advantage in this game of Pokemon Go. And this infuriates my son…and is the only reason I can even begin to keep up with him in this game (and with the many teenagers that are on summer break and do not have to work). That advantage is that mileage matters in several different facets of the program:

  • To hatch eggs, the player has to travel either 2K, 5K or 10K – depending on the type of egg. This distance cannot be travelled in a car (many have tried) so it certainly favors runners. During these summer months, I average 25-30 miles a week which builds up to a lot of hatched eggs.
  • When using incense (which I call perfume, much to the chagrin of my son), the player will see more pokemon when moving at a faster rate. I’ve seen tests showing that a stationary player will only see a pokemon every five minutes with the incense, but a player moving quickly sees one every minute. When you do this as a runner, I highly suggest you make sure you have enough pokeballs.
  • If you have a Lucky Egg (which doubles your XP earnings for 30 minutes), this can be a great combination with incense while running. I did this twice, for parts of two separate four-mile runs, averaging between a 9:00 and 9:45 pace in the lovely South Texas heat and humidity. In the 30 minutes the incense and Lucky Egg were active on the first instance, I caught 21 pokemon (missed 1) and gained 6000 XP. So…not quite one per minute, but not bad. On the second, I caught 25, missed one, and gained 9000 XP.

It may be obvious, but the downsides to running with the game are:

  • Pace is slower (at least mine is) due to distraction. I had been able to flick the pokeballs while running, but it only took running out of pokeballs once to stop that foolishness. Now some of those one-handed throws are acting like curveball throws, without me meaning to throw them. That may be related to the next problem.
  • Down here in Southeast Texas, sweat is a problem. When I run, it is usually 80 degrees and 70-80% humidity. It is very irritating to try and throw a pokeball while running with sweat on your fingers. It can be done, but who needs those kinds of challenges. And as I mentioned in the previous bullet, I’m seeing some unintentional curveball throws, which may be due to sweat on the screen.

I have an old Google Glass from an earlier development project. Glass would be a great accessory for this game, and for all games that combine the real world with augmented reality. The ability to see landmarks and have heads-up display facts and stats was one of the benefits of Glass. Unfortunately, the issues it had, particularly with battery life, would have to be fixed. And it had a sweat problem (i.e., sweat be bad for Glass). But imagine just running along and speaking commands to Glass about throwing pokeballs…those who make claims of “nerd herd” would have a field day with that one!

My current collection is below. Have fun!


Xcode – Simulator vs. Device: CAPITALIZATION matters

There are, obviously and intuitively, differences between testing an iOS app on the Xcode Simulator and testing on a real device. The obvious ones run the gamut from the Simulator having no camera to the keyboard working differently on each. The intuitive ones, in my mind, come from the fact that the Simulator is running on a different operating system (OS X) than the devices (iOS) the app is intended for.

The difference that repeatedly bites me is: CAPITALIZATION matters.

The majority of the apps I do at JoSara MeDia are HTML5 apps in a framework called Baker. If you are interested, the rationale behind this is that most of the apps are either coming from books or eBooks (and hence are already in a format close to HTML, like ePub) or are heading in that direction (so we want to make conversion easy).

I was putting in a very nice jPlayer-based audio player (called jquery.mb.miniAudioPlayer; check out the link, it is quite well done), and it looked great on the Simulator, as you can see in the screenshots below. I tested it on several different simulated devices – all looked as expected, and all performed the autoplay function when expected.

Quebradillas Audio 2

In case you are interested, this is from a forthcoming “coffee table poetry book as an app” project called Quebradillas.

Quebradillas Audio 1

But once I transferred the app to a device (either through cable or TestFlight), the audio player graphics did not make the transition (see screenshot below). And neither did the autoplay functionality.

Quebradillas on Device

The culprit, of course, was two instances of capitalization. One was in the name of the included css file – in the head of the two pages, the “q” in “jquery” was lower case, and, as you can see from the Xcode screenshot, the file name itself was “jQuery.” This was acceptable in the Simulator, which runs on OS X (whose default file system is case-insensitive), but would not work on the devices tested, since the iOS file system is case-sensitive (and, interestingly, no error popped up anywhere). After looking at the javascript code in the jquery plugin, I could see that the “Vm” and “P” were icon placeholders…which led me to the css file misspelling.

The autoplay issue was, again, capitalization: the parameter in one of the examples had autoplay in camelCase (i.e., autoPlay), but in mb.miniAudioPlayer.js the parameter was simply “autoplay.”
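A hypothetical illustration of the first bug (the real project’s paths differ):

    <!-- File on disk: css/jQuery.mb.miniAudioPlayer.css -->
    <!-- Resolves on the Simulator's case-insensitive file system,
         silently fails on a device's case-sensitive one: -->
    <link rel="stylesheet" href="css/jquery.mb.miniAudioPlayer.css">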

 

By noting this, I aim to remind my future self to use capitalization as one of the first items to check when apps look different in the simulator vs. on the device, especially when using HTML5 app frameworks.


Using Apple’s HLS for video streaming in apps

Overview

All of the apps JoSara MeDia currently has in the Apple App Store (except for the latest one) are self-contained; all of the media (video, audio, photos, maps, etc.) is embedded in the app. This means that if a user is on a plane or somewhere with no decent network connection, the apps will work fine, with no parts saying “you can only view this with an internet connection.”

This strategy works very well except for two main problems:

  • the apps are large, frequently over the size limit Apple designates as the maximum for downloading over cellular. I have no data that says this would limit downloads, but it seems obvious;
  • if we want to migrate apps to the AppleTV platform (and of course we do!), we have to have a much more cloud centric approach, as Apple TV has limited storage space.

These two issues prompted me to use the release of our Quebec City app as a testing ground for moving the videos included in the app (the largest space-consuming media in the app) into an on-demand “cloud” storage system. I determined the best solution for this to be Apple’s HTTP Live Streaming (HLS).

There are still many things I am figuring out about using HLS, and I would welcome comments on this strategy.

What is HLS and why would you use it

For most apps, there is no way to predict what bandwidth your users will have when they click on a video inside your app. And there is an “instant gratification” requirement (or near instant) that must be fulfilled when a user clicks on the play button.

Have you ever started a video, have it show as grainy or lower quality, and then get more defined as the video plays? This is an example of using HLS with variant playlists (other protocols do this as well).

Simply put, with HLS a video is segmented into several time segment files (denoted by the file extension .ts), which are listed in a playlist file (denoted by the file extension .m3u8) that describes the video and the segments. The playlist is a human-readable file that can be edited if needed (and I determined I needed to, see below).

Added on to this is a “variant playlist” which is a master playlist file in the same format that points to other playlist files. The concept behind the variant playlist is to have videos of multiple resolutions and sizes but with the same time segments (this should be prefaced with “I think”, and comments appreciated). When a video player starts playing a video described by a variant playlist, it starts with the lowest bandwidth utilization playlist (which is by definition smaller in size and therefore should download and start to play the quickest, thus satisfying that most human need, instant gratification), determines through a handshake what bandwidth and resolution the device playing the video can handle, and ratchets up to the best playlist in the variant playlist to continue playing. I am assuming (by observation) that it only will ratchet up to a higher resolution playlist at the time segment breaks (which is also why I think the segments all have to be the same length).
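To make this concrete, here is a minimal, hand-written sketch of a variant playlist; the resolutions are hypothetical, and real playlists usually carry additional attributes like CODECS:

    #EXTM3U
    #EXT-X-STREAM-INF:BANDWIDTH=400000,RESOLUTION=480x270
    hls_400k.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=1000000,RESOLUTION=640x360
    hls_1m.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=1280x720
    hls_2m.m3u8

The player reads the BANDWIDTH attributes, starts low, and switches between the listed playlists as conditions allow.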

Options for building your videos and playlists

There are two links that provide standards for videos for Apple iOS devices and Apple TVs (links below):

These standards do overlap a bit but, as you would expect, the Apple TV standards have higher resolution, because Apple TVs have always-connected, higher-bandwidth (minimum WiFi) connectivity than one can expect with an iPhone or iPad.

To support iPhones, iPads and Apple TVs, the best strategy would be to have 3 or 4 streams:

  • a low-bandwidth/low-resolution stream for devices on cellular connections
  • a mid-range solution for iPhones and iPads on WiFi
  • a high-bandwidth/high-resolution stream for always-connected devices like Apple TVs

Thus the steps become:

  1. Convert your native video streams into these multiple resolutions;
  2. Build segmented files with playlists of each resolution;
  3. Build a variant playlist that points to each of the single playlists;
  4. Deploy
  5. Make sure that when you deploy, the variant playlist and files are on a content distribution network, which will get them cached around the world for faster delivery (only important if you are assuming worldwide customers…which you should).
  6. Put the video tags in your apps.

Converting your native video streams:

My videos are in several shapes and resolutions, since they come from whatever device I have on me at the time. They are usually from an Olympus TG-1 (which has been with me through the Grand Canyon, in Hawaii, in cenotes in the Yucatan and now in Quebec City), my indestructible image and video default, or some kind of iOS device. Both are set to shoot at the highest quality possible. This makes the native videos very large (and the apps they are embedded in larger still).

There are several tools to convert the videos. These are the ones I’ve looked into:

  • Quicktime – the current version of Quicktime is mostly useless in these endeavors. But Quicktime 7 does have all of the settings required in the standards links from the first paragraph of this section. One could set those exactly as specified, get video output, then put the files through Apple’s mediafilesegmenter command line tool. If you do not have an Amazon Web Services (AWS) account, this would most likely be the way to proceed…as long as this version of Quicktime is supported. To get to these options go to File -> Export, select “Movie to MPEG-4″ and click on the “Options” button. All of the parameters suggested in the standards for iPhone and iPad are available here for tweaking and tuning. Quicktime 7 can still be downloaded from Apple at this link.
  • Amazon Web Services (AWS) Elastic Transcoder – for those of us that are lazy like me, AWS offers “Elastic Transcoder”, which provides preset HLS options for 400K, 1M and 2M. The encodings are done through batch jobs, with detailed output selections. There are options for HLSv3 and HLSv4. HLSv4 requires a split between audio and video. This may be a later standard, but I could not get the inputs and outputs correct…therefore, I went with the HLSv3 standards. The setup requires:
    • Setting up S3 repositories that hold the original videos, and destination repositories for the transcodes files and thumbnails (if that option is selected)
    • Creating a “Pipeline” that uses these S3 repositories
    • Setting up an Elastic Transcoder “job” (as an old programmer, I’m hoping this is a nod to the batch jobs of old!) in this pipeline, where you tell it what kind of transcodes should come out of the job, and what kind of playlist.

AWS Elastic Transcoder

  • iMovie – iMovie has several output presets, but I did not find a way to easily adapt them to the settings in the standards.

Building segmented files

Once you have your videos converted, the next step is to build the segmented files (the files that end with .ts) from these videos, plus the playlists that contain the metadata and location of the segmented files. There may be other tools, but these are the only two I have found.

  • Apple dev tools – Apple’s HLS Tools require an Apple developer account. To find them, go to this Apple developer page, and scroll down to the “Download” section on the right hand side (requires developer login). The command to create streams from “non-live” videos is mediafilesegmenter. To see the options, either use the man pages (type “man mediafilesegmenter” at the command line) or just type the command for a summary of the options. There are options for the time length of the segments, creation of a file for variant playlists, encryption options and others. I found through much trial and error that it worked best to use the “file base” option (-f) to put the files in a particular folder and to omit the “base URL” option (-b); I didn’t omit -b at first, not realizing that the variant playlist which points at the individual stream playlists can point to file folders to keep things neat. In the end, I used this command to create 10-second segments of my original (not the encoded) files, to create a high-resolution/high-bandwidth option (a rough example invocation follows this list).
  • Amazon Web Services (AWS) Elastic Transcoder – the Elastic Transcoder not only converts/encodes the videos, but will also build the segments (and the variant playlists). As you can see from the prior screenshot, there are system presets for HLSv3 and v4 (again, I used v3) for 400K, 1M and 2M. The job created in Elastic Transcoder will build the segmented files in designated folders with designated naming prefixes, all with the same time segment. I have, however, seen some variance of quality in using Elastic Transcoder…or at least a disagreement between the Apple HLS Tools validator and the directives given in the Elastic Transcoder jobs. More on that in the results and errors section.
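For reference, a rough sketch of the kind of mediafilesegmenter invocation described above; the file names are hypothetical, and I believe -t sets the target segment duration (check the man page for your tools version):

    # -f is the "file base" output folder; -b (base URL) is omitted on purpose.
    mediafilesegmenter -t 10 -f BikeRide/HI BikeRideHI.mp4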

Building variant playlists

Finally, you need to build a variant playlist, which is a playlist that points to all of the other playlists of different resolution/bandwidth option time segments.

  • Apple dev tools – the variantplaylistcreator cmd-line command will take output files from the mediafilesegmenter cmd-line command and build a variant playlist.
  • Amazon Web Service (AWS) Elastic Transcoder – are you detecting a pattern here? As part of the Elastic Transcoder jobs, the end result is to specify either an HLSv3 or HLSv4 variant playlist. I selected v3, as the v4 selection requires separate audio and video streams and I could never quite get those to work.
  • Manual – the playlist and variant playlist files are human readable and editable.

Currently, I am using a combination of Elastic Transcoder and manual editing. I take the variant playlist that comes out of Elastic Transcoder (which contains 400K, 1M and 2M playlists), then edit it to add the higher-res playlist I created using mediafilesegmenter. This gives a final variant playlist with four options that straddle the iOS device requirement list and the Apple TV requirement list.

Putting the videos into your apps

Most of my apps are HTML5 using a standard called HPUB. This is to take advantage of multiple platforms, as HPUB files can be converted with a bit of work to ePub files for enhanced eBooks.

Using the videos in HTML5 is straightforward – just use the <video> tag and put the variant playlist file in the “src=” parameter.
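A minimal sketch, with a hypothetical CDN URL standing in for the real path:

    <video src="https://example-cdn.cloudfront.net/BikeRide/BikeRideVariant.m3u8"
           controls preload="none" width="640" height="360"></video>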

Results and Errors

In the end, the videos work, and seem to work for users around the world, with low or high bandwidth, as expected. I’m sure there are things that can be done to make them better.

I’ve used the mediastreamvalidator command from the Apple Developer tools pretty extensively. It doesn’t like some of the things about the AWS Elastic Transcoder generated files, but it is valuable in pointing out others.

Here are some changes I’ve made based on the validator, and other feedback:

Error: Illegal MIME type – this one took me a bit. The m3u8 files generated by AWS are fine, but files such as those generated by the mediafilesegmenter tool do not pass this check. They get tagged with the error “–> Detail: MIME type: application/octet-stream”. In AWS S3 there is a drop-down list of MIME types in the “Metadata” section, but none of the recommended Apple MIME types are there. The files generated by AWS have the MIME type “application/x-mpegURL”, which is one of the recommended ones. Since it is not a selection in the drop-down, it took me a while to determine that you can actually just manually enter the MIME type into the field, even if it is not in the drop-down list. Doh!
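One way to avoid the manual console edit is to set the MIME type at upload time with the AWS CLI; a sketch with hypothetical bucket and file names:

    # --content-type sets the S3 object's MIME type so the playlist serves correctly.
    aws s3 cp BikeRide/BikeRideHI.m3u8 s3://my-video-bucket/BikeRide/BikeRideHI.m3u8 \
      --content-type "application/x-mpegURL"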

Time segment issues – whether utilizing AWS Elastic Transcoder or the mediafilesegmenter cmd-line tool, I’ve always used 10-second segments. Unfortunately, either Elastic Transcoder isn’t exact or the mediastreamvalidator tool does not agree with Transcoder’s output. Here’s an example snip from mediastreamvalidator’s output:

Error: Different target durations detected

–> Detail:  Target duration: 10 vs Target duration: 13

–> Source:  BikeRide/BikeRideHI.m3u8

–> Compare: BikeRide/hls_1m.m3u8

–> Detail:  Target duration: 10 vs Target duration: 13

–> Source:  BikeRide/BikeRideHI.m3u8

–> Compare: BikeRide/hls_400k.m3u8

–> Detail:  Target duration: 10 vs Target duration: 13

–> Source:  BikeRide/BikeRideHI.m3u8

–> Compare: BikeRide/hls_2m.m3u8

This is basically saying that the “HI” version of the playlist (which was created using Apple’s mediafilesegmenter command-line tool) has a ten-second target duration, but the AWS Elastic Transcoder playlists (the three that start with “hls”) have thirteen…even though the job that created them was set for 10-second segments. I am still trying to figure this one out, so any pointers would be appreciated.
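For anyone checking their own files, the tag the validator is comparing is EXT-X-TARGETDURATION near the top of each media playlist. In BikeRideHI.m3u8 (from mediafilesegmenter):

#EXT-X-TARGETDURATION:10

And in hls_1m.m3u8 (from Elastic Transcoder, despite the job asking for 10):

#EXT-X-TARGETDURATION:13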

File permissions – when hosting the playlist and time segment files in an AWS S3 bucket, every upload resets the files’ permissions, so they always need to be made readable again (either “Public” or set up correctly for a secure file stream). This seems obvious, but working through the issues the validator brought up had me uploading files multiple times, and the reset permissions always came back to bite me as an error in the validator.
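The fix, again via the AWS CLI (placeholders as before), is either to re-apply the ACL after each upload or to set it as part of the upload:

# Re-apply public read to an object that was just re-uploaded...
aws s3api put-object-acl --bucket my-bucket \
    --key BikeRide/hls_1m.m3u8 --acl public-read

# ...or set the ACL (and MIME type) during the upload itself.
aws s3 cp hls_1m.m3u8 s3://my-bucket/BikeRide/hls_1m.m3u8 \
    --acl public-read --content-type "application/x-mpegURL"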

HLS v3 vs. v4 – except for the fact that v4 requires separate audio and video streams, I’m still clueless as to when and why you would use one version over the other. It would seem that a standalone audio stream would be useful for really, really low bandwidth. But separating out the video and audio streams is quite a bit of extra work (I would be thrilled if someone would leave a comment about a simple tool to do this). I can see some advantage in separate streams, in that they would allow the client to choose a better video stream with lower-quality audio based on its own configuration. More to learn here for sure.
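For what it’s worth, my understanding is that a v4 variant playlist with a separate audio rendition looks roughly like this (group names and URIs are illustrative, and I haven’t gotten this working myself):

#EXTM3U
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="aud",NAME="English",DEFAULT=YES,URI="audio/main.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=1000000,AUDIO="aud"
video/hls_1m.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2000000,AUDIO="aud"
video/hls_2m.m3u8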

Client usage unknowns – now that the videos work, how do I know which variant is being played? It would be good to know whether all four variants are being used, and under what circumstances (particular devices? bandwidths?). There is some tracking on AWS that I can potentially use to determine this.
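If S3 server access logging is enabled on the bucket (an assumption about the setup), even a crude tally of requests per variant would answer part of this:

# Count log lines mentioning each variant; names match the playlists above.
for v in hls_400k hls_1m hls_2m BikeRideHI; do
  printf '%s: %s\n' "$v" "$(cat access-logs/*.log | grep -c "$v")"
done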

I hope this helps anyone else working their way through Apple’s HTTP Live Streaming. Any and all comments are appreciated. Thanks to “tidbits” from Apple and the Apple forums for his assistance as I work my way through this.

To see the app that this is used in, click on the App Store logo. I’d appreciate feedback, especially from those not in the US, on (a) how long it takes the videos to start and (b) how long it takes the quality to ramp up.


Notes and Thoughts on Edward Tufte’s one-day seminar

Dr. Edward Tufte is doing his one-day seminar tour. I sent two of my team to attend on the first day in Austin, and I went on the second day in Houston. If that doesn’t send the message that I think this is a very worthwhile and valuable seminar, then let me be clearer: Dr. Tufte has been and remains the expert in data visualization, and he not only keeps up with developments in the field, he explores and expands it with his own work.

The fee was $380, and it includes all four of Dr. Tufte’s books, which cost $100 by themselves. I was not aware that these four gorgeous books were self-published by his own Graphics Press; that gives my JoSara MeDia something to aspire to.

And I got all four of his books signed, so the geek fan boy in me is quite happy.

There was no set structure to the presentation, though in typical Tufte fashion there were handouts and suggested reading during the “study hall” period. I got there early, sat in the front row, and had Dr. Tufte come down the row, introduce himself, and ask what I did while signing my books. We talked a bit about medical records, EPIC (a large EMR company) and how faxes still dominate the medical field.

Besides geeking out with Dr. Tufte, what did I get out of it?

  • As always, I adopted some suggestions for many of our apps, including thinking about the Media Sourcery workflow app as a box score (which it already somewhat resembles) and taking my Grand Canyon map experiments into a video panning phase (which I tried once, but it looks like the tools have improved).
  • Too often we hear our customers ask us to “dumb it down for the users.” Tufte continually reinforces the opposite of this, that users can handle the complexities, as he goes through his examples.
  • And, of course, his fundamental principles of analytical design (I listed seven, his book BEAUTIFUL EVIDENCE lists six…so maybe I created one!)
  • The “Tufte-isms” were great, thrown out frequently…so frequently I’m not sure I captured all of them. Quotes from Dr. Tufte are marked “Tufte:” below.

The outline below is my own, just to arrange my notes. They are here for my bad memory, and for your consumption.

Introduction

Like most of the world, Dr. Tufte is not a fan of PowerPoint. And the readings assigned in the study hall (more on study hall later) reminded me of a few of his standard recommendations:

  • Simplification is not the answer, users can handle complexity
  • “Clutter and confusion are failures of design, not attributes of information” (Envisioning Information, pg. 50)

Study hall consisted of several assigned readings in the hour set aside, during which Dr. Tufte roamed around, signing books and talking. Also in the agenda is a set of “Special Interest Topics” (ten sections of these) and two selections of “Homework”. Wonder if I should set deadlines for my guys to get this done… :)

Information Examples

Dr. Tufte went through multiple examples of “information as the interface.” From the seminar page on his website:

“Fundamental design strategies for all information
displays: sentences, tables, diagrams, maps, charts,
images, video, data visualizations, and randomized
displays for making graphical statistical inferences.”

Example #1 – Stephen Malinowski’s Music Animation Machine (try it out at the link)

Example #2 – National Weather Service (the base site is linked here, but the page reviewed was a specific forecast; enter a zip code to see a similar page)

Tufte notes the page presents its “little data” as plain numbers and words, with no little-data graphics…so viewers aren’t forced to figure out graphs.

Everyone knows how to use, read and view numbers, words and simple graphics.

Tufte: “Minimize design-figuring-out-time, Maximize content reasoning time”

A lot of data on one scrollable page. “Humans are good at scanning and recognizing what they are looking for” (example: finding your name on a long list of names)

Tufte: “Being read to from a Powerpoint, the rate of information transfer asymptotically approaches zero.” This guy is witty, eh?

Tufte: “Only two industries call their customers users: illegal drugs and software.” OUCH!

1st: view/eye, 2nd: scroll, 3rd: drill down

Example #3: Policy Story from the NY Times (old article, can’t find online)

Logo and author links show responsibility, accountability, credibility

There are 60 numbers in the story and no graphics, reinforcing the earlier point that everyone knows how to use/read/view numbers and words…no need to get fancy.

Tufte: “Use experts to get the presentation/article out of your voice and into the expert’s voice.”

The graph in this article is terrible, sourced from a lobbyist group, with no defensible numbers.

Example #4: Health Article from NY Times (again, an old article). Dr. Tufte does like the NYTimes website, and uses it frequently in examples.

The main point here is a graph of charges vs. Medicare reimbursement. It uses annotations directly on the graphic, which helps the reader immediately know how to read the display.

Dr. Tufte continually emphasized comparing corporate IT properties to Google News, Google Maps, the NY Times and the WSJ. “Put your IT material next to these. Aim high.”

Example #5: ESPN.com World Series page

He used the box score, one level down from the home page: an example of numbers and words, a table with lots of numbers, that has been viewed all season for every game for over a decade. Great example….though the example he showed didn’t have the cool underwear ad that mine captured! Score!

Tufte makes a point about ordering by interest or mathematical order, not alphabetically.

He ends this segment with my favorite quote: “No matter how beautiful your interface is, it would be better if there is less of it!”

How to give a presentation

From the seminar page on his website:

“A new, widely-adopted method for presentations:
meetings are smarter, more effective, 20% shorter.”

  • Begin with a document – Paper has the highest resolution of any current medium
  • Every meeting should begin with a study hall – don’t make a big deal about it. Give attendees your document at the beginning of the meeting and ask them to read it. No one reads material passed out before a meeting. Use meeting time for the content.
  • Don’t touch every point in the document. Annotate.
  • Take questions. Take notes on the questions. Directly answer the questions.
  • Example: take a document with your symptoms and questions in to your doctor.

Dr. Tufte showed an article about Amazon, where they have no PowerPoint and all meetings start with a 30-minute study hall. A quote from the article (not sure whom to attribute): “PowerPoint is easy for the presenter, hard for the audience.”

Information Architecture

From the seminar page of his website:

“Standards of comparison for workaday and for cutting
edge visualizations. How to identify excellent
information architectures and use them as models and
comparison sets for your own work and for the work
of your contractors. Monitoring the designs of others.”

This section was a bit of a jumble, with various topics, but some excellent examples.

He covered scientific publishing, discussing how jargon is reduced as articles move from the back of Nature to the front of Nature to the more populist websites and publications. Most people only read the abstracts (so true), and the abstract should state (like a thesis) problem-relevance-solution.

After a break, Dr. Tufte once again lamented the woefulness that is government and corporate IT dashboards. He referenced the ESPN.com box score page again as an easily readable dashboard with tons of numbers, a “standard for comparison.”

The point of an information display – “To help thinking about the content”

Example #1: NYTimes Article with Annotated Linking

Tufte notes that the NYTimes employs 40 “Graphics News Reporters.” Do not use plain, undescriptive lines; use annotations on lines. He also went through a diagram on pg. 78 of BEAUTIFUL EVIDENCE that uses annotated lines to track SARS patients.

Example #2: Tim Berners-Lee, his original paper proposing what became the World Wide Web and linking

Tufte showed the comment Berners-Lee’s manager originally put on top of the paper: “Vague but exciting…”. And a good phrase from the document: a hierarchy of nouns versus the flatness of verbs.

Example #3: xkcd – a Venn diagram of items on the front page of a college website intersecting with the things you really want to know…with only the college’s name in the intersection.

Example #4: Google Maps – again reinforcing “compare this with your workaday presentations,” since everyone uses and knows how to use this seemingly complex app. Compare your diagrams to Google Maps. Satellite view: overlay data on top of the imagery.

Example #5: Popular Music chart (from pg. 90-91 of VISUAL EXPLANATIONS). I found a similar one online; it is shown to the right. It is a flat interface; Tufte showed an iPad version of it, in a video and on screen, which had the artists’ names clickable, playing their music with videos behind the diagram. Very cool.

Then he showed the Viz-A-Matic…a graph generator that demonstrates what NOT to do.

Being a Spectator (consuming a presentation)

From the seminar page:

“New ideas on spectatorship, consuming reports.
How to assess the credibility of a presentation
and its presenter, how to detect cherry-picking,
how to reason about alternative explanations.”

Tufte: “A presentation or graphic should provide reason to believe.”

Tufte: “An open mind but not an empty head.”

Two things in presentations as a spectator – Content and Credibility

Watch out for cherry picking (which is not when a Houston Rocket stays back near his basket waiting for a long pass): picking only certain details to support your points. Also watch for presentations that don’t link to their source documents, or that are in a “rage to conclude“.

Tufte: “Why go to a presentation whose conclusion you agree with?”

Measurements in presentations – have a sense of what is relative. See how the measurements are actually made; get out into the field and see directly, and the fog of data will fall from your eyes. Tufte mentioned a chemical company that was policing itself, collecting water samples only in clear water…”sampling to please.” People and institutions cannot keep their own score.

Search Google Images for data; those search results are not gamed the way normal Google text searches are.

Finale – Maps moving and Fundamentals

A little bit of talk about displays, and a demo of a movie panning over very nice maps of the Swiss Alps. This is something I’ve tried to do with the Grand Canyon app, and I’ll try again.

Dr. Tufte also talked about Small Multiples and Sparklines briefly, but these are covered in detail in his books.

Dr. Tufte demonstrated a tool called “ImageQuilts” by Adam Schwartz. It is a Chrome extension. I played with it, using my friend and artist Barbara Franklet’s images via a Google image search.

Barbara Franklet Image Quilt

He closed with his Fundamentals. As stated in the beginning, there are six in the book, but I counted seven.

  1. Show Causality
  2. Make Comparisons
  3. Displays Should be Multi-variate
  4. Integrate All Modes of Information (text, numbers, images, videos, everything)
  5. Document (i.e., show sources)
  6. It’s all about content
  7. Displays are flat, but the world is not

Tufte: “move to web-based presentations. move away from flatland.”

I cannot recommend this seminar and these books highly enough.

Edward Tufte Signature
