tvOS Parallax

With the release of tvOS, a primary focus (pun intended) is on layered images. Apple provides a really nice parallax preview tool for your app's assets. The model in this prototype is Emily Ratajkowski, so I jokingly call it Ratallax.


Lembas iPhone App Prototype

Since it's New York Fashion Week, I have been thinking a lot about digital products that could enhance the shopping experience. My friend Lana owns a store called Lembas, which sells handmade jewelry and bridal accessories. I asked her if I could use her store as the subject of a prototype. She makes some amazing products, so consider shopping there!


One problem often encountered in shopping is previewing, or getting a sense of, how something might look on you. Context and occasion are key. I decided to explore ways to integrate a preview experience that feels supplementary to the app: useful without being the primary focus.

In this prototype I knew I wanted to use the iPhone's native camera view, so I decided to use Form, a prototyping tool by Relative Wave (recently acquired by Google). At first glance, Form shares many aspects with Quartz Composer. I like to work with nodes and patches because I like to visualize prototypes in a radial fashion.


I essentially created a camera view that lives at the bottom z layer of an app, almost like a drawer you could access as if you were reaching for a mirror.

In an ideal world, I would get 3D rendered images of the jewelry, but since this is a prototype I took one of Lana's photos to mask out the object.


For prototyping, I often work with Sketch and a prototyping tool in parallel. I use just enough fidelity to get the idea across.




Here is a quick video of the prototype I put together.

You can learn more about Lembas here.

Cell animation prototypes

I've been inspired by my friend William Van Hecke (User Experience Lead at Omni Group), who has been posting daily on his blog about his thoughts on design. I will try to follow his example and write daily, partly as an excuse to use my 12" MacBook (yay), though I may not post everything. What I will do is create a design prototype once a week. I'm giving a presentation in July about creating prototypes with Origami and will eventually share these prototypes on my GitHub account.

This project is an exploration of cell behavior. In this fictitious app, I explored the idea of custom cells that expand, letting a user tap a person in a list to contact them.

I also wanted to prototype the behavior of marking something as done. This gesture-recognizer interaction is popular among many third-party apps on iOS. In this prototype, the swipe is conditional: swiping slightly shows the user a checkmark indicating that the item will be marked as done.
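As a rough translation of that conditional swipe into UIKit terms (all names here are hypothetical sketches, not taken from the Origami prototype), the drag distance drives the checkmark's opacity, and only a drag past a threshold commits the action:

```swift
import UIKit

// Sketch: the checkmark fades in proportionally to the drag, and only a
// drag past the threshold actually marks the item done.
func checkmarkAlpha(forDrag dx: CGFloat, threshold: CGFloat) -> CGFloat {
    min(1, max(0, dx / threshold))
}

final class SwipeToDoneCell: UITableViewCell {
    let checkmark = UIImageView(image: UIImage(systemName: "checkmark"))
    private let threshold: CGFloat = 60

    @objc func handlePan(_ pan: UIPanGestureRecognizer) {
        let dx = pan.translation(in: self).x
        switch pan.state {
        case .changed:
            // Slide the content and reveal the checkmark proportionally.
            contentView.transform = CGAffineTransform(translationX: max(0, dx), y: 0)
            checkmark.alpha = checkmarkAlpha(forDrag: dx, threshold: threshold)
        case .ended:
            if dx >= threshold { markAsDone() }   // conditional commit
            UIView.animate(withDuration: 0.25) {
                self.contentView.transform = .identity
                self.checkmark.alpha = 0
            }
        default:
            break
        }
    }

    func markAsDone() { /* update the model; show "marked as done" */ }
}
```

The important design point is the same as in the prototype: the gesture is reversible until the threshold is crossed, so a slight swipe only previews the outcome.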

Apple Watch Concept: The Usual Coffee

Every day during the work week at 3pm, a group of us on the Black Pixel team go get coffee together at "Coffee O'Clock". Aside from our CTO, who mixes it up often, we tend to get the same thing. The barista even knows what we're going to get, but always asks just to confirm. I was looking for a Watch idea to play around with, and it made me wonder, "What if there was an app that would let me order my coffee before I get in line?"

In this use case, I'm using one of my favorite coffee shops, Sight Glass, as an example for an app.

The assumption is that a wearer walks into the establishment with location services enabled. The app recognizes that the wearer is near the location and pushes that data to Apple Watch. The idea is a simple app that lets a wearer tell the establishment if he or she is ordering the usual, so the baristas can queue up the orders in hopes of creating a better workflow.
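On iOS, the proximity piece could be sketched with Core Location region monitoring. This is a hedged sketch under assumed names: the coordinates, radius, and identifier below are made-up placeholders, not the shop's real location.

```swift
import CoreLocation

// Sketch: monitor a ~100 m region around the coffee shop and react on entry.
// Coordinates, radius, and identifier are hypothetical placeholders.
final class ShopMonitor: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    let shopRegion: CLCircularRegion = {
        let region = CLCircularRegion(
            center: CLLocationCoordinate2D(latitude: 37.7767, longitude: -122.4089),
            radius: 100,
            identifier: "usual-coffee-shop")
        region.notifyOnEntry = true
        return region
    }()

    func start() {
        manager.delegate = self
        manager.requestAlwaysAuthorization()   // region monitoring needs Always
        manager.startMonitoring(for: shopRegion)
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        guard region.identifier == shopRegion.identifier else { return }
        // Schedule the "Order the usual?" notification here; the paired
        // Apple Watch surfaces it as a Short Look.
    }
}
```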

Below is a design for three sample views: Short Look Notification, Long Look Notification, and Long Look Notification scrolled. Some design notes:

  • The transition from Short to Long Look Notification occurs if the wearer taps on the screen or continues to hold the watch up.
  • When scrolling, the body will scroll under the sash.


Note that the assumption is that there is an accompanying iPhone app, since Apple Watch apps run their logic in a WatchKit extension on the phone.



Some notes for designers when working on Apple Watch apps:

  • Do not design the experience like an iOS app. This is a different context than a phone. I have often seen Watch app concepts that are over-designed.
  • The wearer will probably interact with Watch very briefly. My mindset has been "How can someone use this in less than one second?"

Apple has provided their Human Interface Guidelines as well as very detailed resources.

Sharing Interaction in Quartz Composer


I was experimenting with ways of revealing social sharing options on the iPhone while playing in Quartz Composer (Origami). In this example app (not a real app), a group of images shows which of your friends have also listened to this song, with the share icon to the right of them.

  • I created a behavior where, when the sharing options expand, the profile photos of friends collapse to de-clutter the space.
  • I wanted the share icon to transform into a close button. It was really important that the close button replace the share button in place, so the user would not have to move his or her finger or thumb to dismiss it.

How I Did It

The animation itself is quite simple. My intention was to have the social icons seem as if they exploded out of the share button.

  • Alpha animates from 0.0 to 1.0.
  • Scale animates from 0.8 to 1.0.
  • Each icon is given its own unique z-rotation to give the impression that the icons are moving independently from one another (as they are).
  • I used a Bouncy Animation to give the animation a bit of friction and tension.
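Those four bullets map fairly directly onto a UIKit spring animation. This is a sketch of the same recipe, not the actual Origami patch graph; `icons` and `shareButton` are hypothetical views, and the damping/velocity values are guesses at the Bouncy Animation feel.

```swift
import UIKit

// Sketch: "explode" the social icons out of the share button.
// Alpha 0.0 → 1.0, scale 0.8 → 1.0, a unique z-rotation per icon,
// and a spring for the bouncy friction/tension.
func explodeIcons(_ icons: [UIView], from shareButton: UIView) {
    for (index, icon) in icons.enumerated() {
        icon.center = shareButton.center
        icon.alpha = 0.0
        icon.transform = CGAffineTransform(scaleX: 0.8, y: 0.8)
            .rotated(by: CGFloat(index + 1) * .pi / 12)   // unique starting rotation
        UIView.animate(withDuration: 0.5,
                       delay: 0.04 * Double(index),       // slight stagger per icon
                       usingSpringWithDamping: 0.6,
                       initialSpringVelocity: 0.8,
                       options: [],
                       animations: {
            icon.alpha = 1.0
            icon.transform = .identity                    // scale and rotation settle
        })
    }
}
```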



Creating iOS effects using Axure RP

"Theatricality and deception are powerful agents."

The famous words of Henri Ducard in Batman Begins. A prototype is intended to test a concept, so it is okay to use a bit of smoke and mirrors to get your point across. When I get asked what I use for prototyping, I simply say "whatever I have around me." Though I use all sorts of tools for prototyping, Axure RP is one of my favorites. This blog post will show you how to create the illusion of some iOS 7 effects, such as blurring, swiping, and presenting modals.

To do this you'll need Axure RP and an image-editing tool (I use Photoshop) to create the assets. You will also need to be somewhat familiar with Axure already. If you are not, there is a great set of tutorials on the Axure website.

Basic Approach

Axure is by nature a prototyping tool for interaction, not necessarily animation. When I want to prototype animations, I use Adobe After Effects or Quartz Composer with Facebook's Origami. Axure does have some simple animations that get the job done, but if you want a fairly polished prototype, creating original assets in Photoshop and combining them with Axure is very powerful.

The reason I take this approach instead of doing the entire prototype in After Effects is the interaction model: Axure lets testers explore the prototype non-sequentially instead of watching a continuous timeline play back.

Blur Effect

To my knowledge, there is no way to create blur effects in Axure. What I do instead is create two states of one image in Photoshop. In the example below, you will see that one state has the static image and the other has the same image treated with a Gaussian blur. It can be a bit tedious to create an image for every state, but it won't take too long if you know your way around Photoshop, and the result is a natural-feeling prototype.


In my project I created three sets of images to show navigating between different products. All you need to do is:

  1. Create a dynamic panel.
  2. Create two states: one with the static image and the other for the blurred image.
  3. When you want to initiate the interaction, do a panel state change and set the effect to "fade". This will create the illusion of the image being blurred when in reality the static image is fading away and revealing the blurred one.
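For comparison, the same illusion expressed in UIKit would be a simple crossfade between the sharp and pre-blurred assets; this is a hypothetical sketch, not part of the Axure project.

```swift
import UIKit

// The Axure trick in code: instead of blurring live, crossfade the image
// view from the sharp asset to a pre-blurred copy of the same image.
func showBlurred(_ blurred: UIImage, in imageView: UIImageView) {
    UIView.transition(with: imageView,
                      duration: 0.3,
                      options: .transitionCrossDissolve,   // Axure's "fade"
                      animations: { imageView.image = blurred })
}
```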


Swiping Between Products


In this project I made the view swipeable so you can move between different products. Here's how:

  1. Create a dynamic panel and put all of your products in it as panel states.
  2. For the interaction set On Swipe Left to show the next panel state and On Swipe Right to the previous panel state.
  3. I have the animation of both the entering and leaving elements moving in the same direction. This creates the illusion of swiping in iOS.
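In UIKit terms, the equivalent of those panel states is a paging scroll view, where the entering and leaving products move together by construction. This is a hypothetical sketch, not part of the Axure project; `productViews` and `pageSize` are assumed names.

```swift
import UIKit

// Sketch: one product per "page"; paging snaps a full page per swipe, so the
// entering and leaving views slide in the same direction together.
func makeProductPager(productViews: [UIView], pageSize: CGSize) -> UIScrollView {
    let scrollView = UIScrollView(frame: CGRect(origin: .zero, size: pageSize))
    scrollView.isPagingEnabled = true
    scrollView.showsHorizontalScrollIndicator = false
    for (index, view) in productViews.enumerated() {
        view.frame = CGRect(x: CGFloat(index) * pageSize.width, y: 0,
                            width: pageSize.width, height: pageSize.height)
        scrollView.addSubview(view)
    }
    scrollView.contentSize = CGSize(width: CGFloat(productViews.count) * pageSize.width,
                                    height: pageSize.height)
    return scrollView
}
```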


Calling Modals

To create a modal in Axure, treat it as a flyout. This one is really simple: create a dynamic panel, hide it, and have it show on an interaction. Set the animation to "slide up" or whatever else you prefer.


Revealing Details

In this prototype I use On Click but another idea I had was to reveal this on tap and hold. Like the modal, this is simply a hidden dynamic panel that reveals in a fade.


Here is the prototype (open it on your iPhone for the best experience). I also did a screencast using AirServer, which I highly recommend.

Well, there you have it: a few different ways to create iOS effects using Axure. If you have any questions or requests for other interactions, leave a comment! A special thanks to Fortnight Lingerie for allowing me to use their branding in this prototype; it is very much appreciated.

Home Interfaces: The Bathroom

At Launch this year, Yves Béhar gave one of the fireside conversations, and he was phenomenal. He spoke a lot about design for the home, emphasizing how vital simplicity is to the home experience. The reason? Any interface has to compete with the simplicity of the light switch. This inspired me to start thinking about mobility vs. mobile. I began brainstorming about what sort of interfaces would be useful for me and (hopefully) other people. The idea of interactive mirrors or walls became very appealing to me: something that serves one purpose but can be called on to serve another when needed.

Like most people, I spend most of my morning routine in the bathroom. This is where I think about things I need to do, ask Siri what the weather is like, and figure out what I'm going to wear.

I thought: "What if we could have an interactive mirror in the bathroom that could display useful information as we are getting ready for the day?"


The first question I pondered was "How would this mirror be activated?" Going back to the light switch analogy, I had to think about simplicity. It can't be a screen that needs to be booted up or paired over Wi-Fi with another device; the light switch already wins. It has to be automatic. Some thoughts:

  • The interface is initiated when a user with a Bluetooth-enabled device (phone or wearable) enters the bathroom. If there are two people, the screen interface can split for each person.
  • A smart toothbrush could be installed. When a user removes the toothbrush from its stand, it identifies the person by toothbrush and activates the interface. If possible, the toothbrush would take a saliva sample and display the person's biometrics on the mirror.
  • Voice activation. This would be simple to calibrate and is a pretty solid method of authentication. From a security perspective, I think it would work because the mirror does not display confidential account information, and the assumption is that the locked door of the house or apartment remains the primary method of security.


Because the use case is a person getting ready for the day, the features should be a high-level overview, nothing too comprehensive. Some features I came up with are:

  • Weather. You would be surprised how many times I go into the bathroom to get ready for the day before I even look outside. This interface would display the current weather and even provide a forecast for the week.
  • The interface would make recommendations of what a person should wear based on the weather. Initially this might be a suggestion of general items, such as "Wear a jacket with a scarf." If there was some inventory API that has every article of clothing the user owns then it could recommend specific attire.
  • Health metrics would be displayed and the interface can provide recommendations of activities or meals to eat.
  • Display a to-do list (not pictured). This could be plugged into an existing service like OmniFocus.

What are your thoughts? Would you use a product like this? How much would you pay for one? This is the first concept of the mirror, and I will continue to iterate on it. I hope to design a prototype that can be dropped into After Effects to produce a video demo of someone using it.