Developing for tvOS using the Remote Layout Helper

I am developing an app for tvOS. One thing you quickly run into when testing on real-world devices is the limited usable screen size on several TV models. For this, Apple added a calibration tool to the tvOS Settings.

I found I needed this information to make my UI layout actually work for customers. So I took a screenshot and turned it into a screen image (PNG) that I can overlay at any time during development, for my own orientation, as a Remote Layout Helper.

tvOS Layout Template
Download the Overlay Image (use Save as…)

On my main view controller I added the following method, which I call from - (void)viewDidAppear:(BOOL)animated:

- (void)activateCalibrationOverlay {
    if (DEBUG_CALIBRATION_OVERLAY) {
        if (!self.overlayImageView) {
            // Create the overlay image view: fully transparent, non-interactive,
            // and placed on top of the whole window.
            self.overlayImageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"tvOS_screen_size.png"]];
            self.overlayImageView.alpha = 0.0;
            self.overlayImageView.userInteractionEnabled = NO;
            [[self appDelegate].window addSubview:self.overlayImageView];
            [[self appDelegate].window bringSubviewToFront:self.overlayImageView];
            if (!self.overlayPlayPauseGesture) {
                // Listen for the PLAY/PAUSE button on the Siri Remote.
                self.overlayPlayPauseGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(handleRemoteTapPlayPause:)];
                self.overlayPlayPauseGesture.allowedPressTypes = @[@(UIPressTypePlayPause)];
                [self.view addGestureRecognizer:self.overlayPlayPauseGesture];
            }
        }
    }
}
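The call site is then simply this (a minimal sketch of the viewDidAppear: override):

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Install the calibration overlay once the view is on screen.
    [self activateCalibrationOverlay];
}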

activateCalibrationOverlay adds the Remote Layout Helper screen image as an overlay on TOP of everything else, plus a gesture recognizer monitoring the PLAY/PAUSE button of the TV remote. The gesture recognizer calls the following method:

- (void)handleRemoteTapPlayPause:(UIGestureRecognizer *)tapRecognizer {
    // The overlay steps through these alpha levels, one step per PLAY/PAUSE press,
    // then wraps around to fully transparent again.
    static const CGFloat levels[] = { 0.0, 0.3, 0.5, 0.8, 1.0 };
    static const NSUInteger levelCount = sizeof(levels) / sizeof(levels[0]);

    // Find the current level and pick the next one. Compare with a small
    // tolerance instead of ==, because values like 0.3 and 0.8 are not
    // represented exactly as floating point numbers.
    NSUInteger nextIndex = 0;
    for (NSUInteger i = 0; i < levelCount; i++) {
        if (fabs(self.overlayImageView.alpha - levels[i]) < 0.01) {
            nextIndex = (i + 1) % levelCount;
            break;
        }
    }

    [UIView animateWithDuration:0.3 animations:^{
        self.overlayImageView.alpha = levels[nextIndex];
    }];
}

This helped me a lot to get things right during development. I hope it helps you too. Feel free to share!

How to use it

You simply press PLAY/PAUSE again and again, and the overlay will fade in at different alpha blending levels between 0.0 and 1.0. This makes it easy to check the boundaries of the displayed UI elements against the screen limits.

See the following screen example:

The overlay in action on top of a running app.

Note that you need a retained property for the UIImageView, called overlayImageView, and another one for the tap gesture recognizer, called overlayPlayPauseGesture. Also note that I use a boolean preprocessor flag named DEBUG_CALIBRATION_OVERLAY to switch this feature OFF in deployment builds.
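In case it helps, those declarations might look roughly like this (a sketch only; MyMainViewController is a hypothetical class name and the flag value is just an example):

// Compile-time switch for the overlay feature. Set to 0 (or strip the code entirely) for release builds.
#define DEBUG_CALIBRATION_OVERLAY 1

@interface MyMainViewController ()

// Retained references, so the overlay and its gesture recognizer stay alive.
@property (nonatomic, strong) UIImageView *overlayImageView;
@property (nonatomic, strong) UITapGestureRecognizer *overlayPlayPauseGesture;

@end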

Why do I blog this? I found it cumbersome to check whether my app works on real-world TV screens. So I just made sure that most of my UI is usable within the MINIMUM SCREEN SIZE frame the overlay graphic shows me. Being able to check this at any time using the remote is a huge plus, also on the real device. (Do not forget to disable the code for deployment!)

Measuring Radioactivity with the PocketGeiger App & Detector

I got out my PocketGeiger app and the (by now rather old) Type 1 detector again and took a 20-minute measurement on the balcony. Result: 0.02 µSv/h

More information is available on the project's website, www.radiation-watch.org/, an initiative that was founded after Fukushima.

Pictures of the measurement
Source: my own photos taken on the balcony

Doing that, I got the urge to order a smartwatch.

I also came across an updated website of the Bundesamt für Strahlenschutz (Federal Office for Radiation Protection). They now have a good website and a better map of the gamma ambient dose rate (ODL) monitoring network.

In the USA there is apparently something comparable, both as a volunteer network and as a government network, in the form of the Nuclear Emergency Tracking Center and another one run by the environmental agency.

In Japan, as a result of the citizens' initiatives after Fukushima, there is the Japan Radiation Map.

This map shows ca. 4,500 up-to-date radiation measurements, collected from various official sources. On roll-over, information is provided for that particular location; radiation levels are visualized by the colored square's size. Locations marked with the + sign reveal more locations on zoom-in. Logging since March 2011, the accumulated data now contains 100,000,000+ records, available for research.

Map of the ODL monitoring network
Source: the BfS ODL monitoring network website

I then requested a registration for their API.

The App

By the way, this app is available here: https://itunes.apple.com/de/app/pocket-geiger-counter-pro/id448043815?mt=8

The app in the App Store

More nice tracking projects

Geiger counter by korrupt, measuring beta/gamma background radiation in Wuppertal-Elberfeld

Why do I blog this? Given that I neither find Apple's Apple Watch particularly "smart" nor like their efforts to eliminate the iPhone's standard audio jack (which is the ultimate and only interface for the PocketGeiger), I wanted to record how well this setup works.

RealmExplorer – a piece of sample code for Realm DB

First things first: have a look at the RealmExplorer GitHub repo for the project.

Preface: CoreData

I have some history with databases, from Oracle, FrontBase, MySQL, PostgreSQL, SQLite, and so on. The last time I used a database was for a postcard app built on CoreData. But I found CoreData to require a lot of overhead and boilerplate code.

Especially migrations are not very intuitive to understand, and if something goes wrong you are basically lost. While Xcode has had an entity modelling tool for a while (which has its roots in the WebObjects EOModeler), most of the time this tool did not work very well for me and had lots of bugs. Different bugs in each Xcode version!

Also, many people report running into issues when using CoreData with lots of objects, because it is easy to lose track of memory consumption with CoreData, and if you do not actively check and work against that, you end up with the whole object graph in memory. Not cool! So I wanted to try something different.

Realm

I found Realm by Realm.io interesting enough to tinker with. So I started working with their sample code projects. But none of those projects came even close to a real-world example. So I just started prototyping something I will need for a different project anyway.

Starting with a custom view controller for adding & editing entities, I realized that it would soon become rather painful to provide input fields for text, numbers, and dates. So I grabbed FXForms to give it a try, after I had already built a view controller to manage DBCaptain entities a bit.

Migrations

Right from the start I wanted to know how migrations work, because they are usually the weakest spot of any database operation. With a live product you always "pray" that nothing goes wrong during migrations on the customer's device. And keeping track of changes to the db schema is very important for a stable app.

So I built the sample around a process that "simulates" an app in development. You can iterate through different development steps (5 different app states) and the migrations they need. This way I figured out how I can safely manage my migrations (i.e. always keeping a backup of the old schema). In the project, the DatabaseFactory.m class manages all that. It detects existing schemas using a precise naming scheme for each Realm db it creates.
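For reference, a basic Realm migration in Objective-C looks roughly like the sketch below. It uses Realm's public RLMRealmConfiguration API and is not the actual code from DatabaseFactory.m; the DBCaptain property names are made up for the example.

#import <Realm/Realm.h>

- (RLMRealm *)realmWithMigrationsApplied {
    RLMRealmConfiguration *config = [RLMRealmConfiguration defaultConfiguration];
    config.schemaVersion = 2;
    config.migrationBlock = ^(RLMMigration *migration, uint64_t oldSchemaVersion) {
        if (oldSchemaVersion < 2) {
            // Example: an assumed rename of a DBCaptain property from "name" to "fullName".
            [migration enumerateObjects:@"DBCaptain"
                                  block:^(RLMObject *oldObject, RLMObject *newObject) {
                // Copy the value from the old property into the renamed one.
                newObject[@"fullName"] = oldObject[@"name"];
            }];
        }
    };
    [RLMRealmConfiguration setDefaultConfiguration:config];

    // Opening the Realm triggers the migration if the on-disk schema is older.
    NSError *error = nil;
    return [RLMRealm realmWithConfiguration:config error:&error];
}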

During app launch, any necessary migrations are executed. Also, any errors that occur when the launching app has trouble activating the database are displayed nicely.

Create/Insert/Edit/Delete

I started with one entity, DBCaptain, which is basically a user entity. I just wanted to create user objects and populate them with different properties. I tested insertion with one and with many users, on the main thread and on a background thread. I also took advantage of the notifications Realm sends as soon as an operation has finished updating the db.
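For illustration only (the real DBCaptain model in the repo has more properties), defining and inserting such an entity with Realm's Objective-C API looks roughly like this:

#import <Realm/Realm.h>

// A hypothetical, trimmed-down version of the DBCaptain entity.
@interface DBCaptain : RLMObject
@property NSString *name;
@property NSInteger age;
@end

@implementation DBCaptain
@end

- (void)insertSampleCaptain {
    // Insert inside a write transaction; the same code also works on a background
    // thread, as long as the RLMRealm instance is obtained on that thread.
    RLMRealm *realm = [RLMRealm defaultRealm];
    DBCaptain *captain = [[DBCaptain alloc] init];
    captain.name = @"Jean-Luc";
    captain.age = 59;

    [realm transactionWithBlock:^{
        [realm addObject:captain];
    }];

    // Realm posts a notification whenever the Realm is updated; keep the token retained.
    self.notificationToken = [realm addNotificationBlock:^(NSString *notification, RLMRealm *notifiedRealm) {
        // e.g. reload the table view here
    }];
}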

Deletion of objects came next. As always, keeping the UI in sync with all those changes is the more difficult task. So expect some "ugly" code, because this was only me tinkering around. When I wanted to edit existing objects, I realized that I did not want to create all those text fields manually.
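Deleting objects, by the way, uses the same write-transaction pattern (again just a sketch, reusing the hypothetical DBCaptain from above; the predicate is an arbitrary example):

- (void)deleteOldCaptains {
    RLMRealm *realm = [RLMRealm defaultRealm];
    // Fetch all matching objects and remove them in one write transaction.
    RLMResults *oldCaptains = [DBCaptain objectsWhere:@"age > 100"];
    [realm transactionWithBlock:^{
        [realm deleteObjects:oldCaptains];
    }];
}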

FXForms

That was when FXForms came to the rescue. I recreated the already half-done view controller and built another one based on FXForms. I needed some time to wrap my head around FXForms, because it is basically a quite sophisticated hack that lets you provide compact descriptions of UITableViewCells and their content.

You describe what kind of structure your form should have and what kind of value types it should use, and FXForms crafts the UITableView that makes all those input fields come true. Things start to become a bit tricky if you want to control those UITableViewCells more directly. For example, I needed a way to exclude certain properties from being edited at all. So I simply extended the FXForms protocol to allow denying userInteractionEnabled on certain cells by adding an FXFormFieldEditable key. This is used in the CaptainFormViewController so that the information about the database schema and the encryption status is not editable.
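A form description along those lines might look roughly like this. It is a simplified sketch, not the actual CaptainFormViewController code; FXFormFieldEditable is the custom key described above (defined here only to keep the sketch self-contained), and the property names are assumptions.

#import "FXForms.h"

// Custom key, marking a field as read-only; in the repo it is part of the FXForms extension.
static NSString *const FXFormFieldEditable = @"editable";

@interface CaptainForm : NSObject <FXForm>
@property (nonatomic, copy) NSString *name;        // editable
@property (nonatomic, copy) NSString *schemaInfo;  // display-only (assumed name)
@end

@implementation CaptainForm

- (NSArray *)fields {
    return @[
        // A regular, editable text field.
        @{FXFormFieldKey: @"name", FXFormFieldTitle: @"Name"},
        // Informational field that must not be editable.
        @{FXFormFieldKey: @"schemaInfo",
          FXFormFieldTitle: @"DB Schema",
          FXFormFieldEditable: @NO},
    ];
}

@end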

Evaluation

My first impression of Realm DB is quite positive. It is fast, it provides helpful error messages, it is easy to set up, and it is comprehensible in what it does and how it works. I have never in my life had migrations up and running that quickly. Very, very helpful was the Mac app RealmBrowser, which opens a Realm db file directly. I used it to check my migrations and the results of operations that added arbitrary data (e.g. image files).

What is actually really great is that encryption is supported for the db by just providing a 64-byte encryption key (Realm encrypts the file with AES-256). That is really helpful to protect user data and increase the safety of the Realm db file, even if it gets backed up into some cloud somewhere else.
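Setting this up is a one-liner on the configuration. A minimal sketch (in a real app you would generate the key once and keep it in the Keychain instead of creating a fresh one on every launch):

#import <Realm/Realm.h>
#import <Security/Security.h>

- (RLMRealm *)openEncryptedRealm {
    // Generate a random 64-byte key; store and reuse it via the Keychain in a real app.
    NSMutableData *key = [NSMutableData dataWithLength:64];
    SecRandomCopyBytes(kSecRandomDefault, key.length, (uint8_t *)key.mutableBytes);

    RLMRealmConfiguration *config = [RLMRealmConfiguration defaultConfiguration];
    config.encryptionKey = key;

    NSError *error = nil;
    return [RLMRealm realmWithConfiguration:config error:&error];
}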

What’s next?

I think I missed something related to the KVO mechanism Realm uses, but I have not had time to figure that out yet. I suppose they actually track entities and changes via KVO, but I am not yet sure how exactly this works and how I can detect whether an object has pending changes. That is why I added a hasPendingChanges BOOL flag to both view controllers for editing entities. So help is welcome here!!

I did not have time to do some real hardcore performance testing, i.e. adding thousands of entities and relationships in a short time and crafting a rather complex model with relationships that define ownership, cascading deletions, etc. Doing sophisticated db queries is also on my list of things to do next.

But having a prototype to tinker with already helps a lot. Since I learn best from sample code, and I think I am not the only one who learns new things this way, I hope it helps others to jumpstart with Realm, too.

Why do I blog this? I just wanted to give a bit of background info on the sample code I put on GitHub. Have fun tinkering! And I am really in need of more info on how I can detect changes to an object, so I know WHEN to save changes to the db. Maybe someone who has already grasped the concept better can help me here?? Leave a comment or contact me via GitHub.