Design Concepts

Profile Prototype Exploration

I love Principle for Mac. It has quickly become one of my favorite prototyping tools. I often use it really early on when exploring design concepts. This is an example of a profile study I put together to explore different interactions. You can try (and then buy!) Principle here: http://principleformac.com/

[Video: Principle profile prototype]


Apple Watch Concept: The Usual Coffee

Every day during the work week at 3pm, a group of us on the Black Pixel team go get coffee together at "Coffee O'Clock". Besides our CTO mixing it up often, we tend to get the same thing. The barista even knows what we're going to get, but always asks just to confirm. I was looking for a Watch idea to play around with, and it made me wonder, "What if there was an app that would let me order my coffee before I get in line?"

In this use case, I'm using one of my favorite coffee shops, Sight Glass, as an example for an app.

The assumption is a wearer would walk into the establishment with location services on. The app would recognize that the wearer is near the location and push that data to Apple Watch. The idea is a simple app that lets the wearer tell the establishment he or she is ordering the usual, so the baristas can queue up the orders in hopes of creating a better workflow.
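The proximity trigger at the heart of this idea can be sketched as plain logic. This is only a sketch of the concept; the coordinates, radius, and function names are my own placeholders, and a real app would use the platform's geofencing rather than polling distances:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in meters."""
    r = 6371000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

SHOP = (37.7769, -122.4097)  # hypothetical coordinates for the coffee shop
TRIGGER_RADIUS_M = 75        # assumed threshold for "near the location"

def should_prompt_usual(wearer_lat, wearer_lon):
    """True when the wearer is close enough to push the 'the usual?' notification."""
    return haversine_m(wearer_lat, wearer_lon, *SHOP) <= TRIGGER_RADIUS_M
```

Once this check fires, the phone app would hand the notification content off to the Watch.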

Below is a design for three sample views: Short Look Notification, Long Look Notification, and Long Look Notification scrolled. Some design notes:

  • The transition from Short to Long Look Notification occurs if the wearer taps on the screen or continues to hold the watch up.
  • When scrolling, the body will scroll under the sash.

[Image: Short Look and Long Look notification designs]

Note that the assumption is there would also be an iPhone app, since an Apple Watch app's code runs in a WatchKit extension bundled with its companion iPhone app.


[Diagram: WatchKit extension architecture]

Some notes for designers when working on Apple Watch apps:

  • Do not design the experience like an iOS app. This is a different context than a phone. I have often seen Watch app concepts that are over-designed.
  • The wearer will probably interact with the Watch very briefly. My mindset has been "How can someone use this in less than one second?"

Apple has provided their Human Interface Guidelines as well as very detailed resources.


Creating iOS effects using Axure RP

"Theatricality and deception are powerful agents."

The famous words of Henri Ducard in Batman Begins. A prototype is intended to test a concept, so it is okay to use a bit of smoke and mirrors to get your point across. When I get asked what I use for prototyping, I simply say "whatever I have around me." Though I use all sorts of tools for prototyping, Axure RP is one of my favorites. This blog post will show you how to create the illusion of some iOS 7 effects, such as blurring, swiping, and presenting modals.

In order to do this you'll need Axure RP and an image editing tool (I use Photoshop) to create the assets. You should also already be a bit familiar with Axure; if you are not, there is a great set of tutorials on the Axure website.

Basic Approach

Axure is by nature a prototyping tool for interaction, not necessarily animation. When I want to prototype animations I will use Adobe After Effects or Quartz Composer with Facebook's Origami. Axure does, however, have some simple animations that get the job done, and if you want a fairly polished prototype, creating original assets in Photoshop and combining them with Axure is very powerful.

The reason I take this approach instead of doing the entire prototype in After Effects is because of the prototyping framework. Axure will allow testers to interact with the prototype non-sequentially instead of playing a continuous timeline.

Blur Effect

To my knowledge, there is no way to create blur effects in Axure. What I do instead is create two different states of one image in Photoshop. In the example below, one state is the static image and the other is the same image treated with a Gaussian blur. It can be a bit tedious to create an image for every state, but it won't take too long if you know your way around Photoshop, and the result is a very natural-looking prototype.

[Image: static and blurred states of the same asset]

In my project I created three sets of images to show navigating between different products. All you need to do is:

  1. Create a dynamic panel.
  2. Create two states: one with the static image and the other for the blurred image.
  3. When you want to initiate the interaction, do a panel state change and set the effect to "fade". This will create the illusion of the image being blurred when in reality the static image is fading away and revealing the blurred one.
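If you have a lot of assets, generating the blurred counterparts can be scripted instead of done by hand. Here is a minimal sketch using Pillow, assuming it is installed; the function name, radius, and file naming scheme are my own placeholders:

```python
from pathlib import Path

from PIL import Image, ImageFilter

def make_blurred_state(src_path, radius=8):
    """Save a Gaussian-blurred copy of an asset next to the original."""
    src = Path(src_path)
    img = Image.open(src)
    blurred = img.filter(ImageFilter.GaussianBlur(radius=radius))
    out = src.with_name(f"{src.stem}-blurred{src.suffix}")
    blurred.save(out)
    return out

# Batch over a folder of static states, e.g.:
# for p in Path("assets").glob("*.png"):
#     make_blurred_state(p)
```

Each output image then becomes the second panel state in Axure.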


Swiping Between Products


In this project I made the view swipeable so you can move between different products. Here's how:

  1. Create a dynamic panel and put all of your products in it as panel states.
  2. For the interaction set On Swipe Left to show the next panel state and On Swipe Right to the previous panel state.
  3. Set the animation so both the entering and leaving elements move in the same direction. This creates the illusion of swiping in iOS.

Modals

To create a modal in Axure, treat it as a flyout. This one is really simple: create a dynamic panel, hide it, and have it show on an interaction. Set the animation to "slide up" or whatever else you prefer.


Revealing Details

In this prototype I use On Click but another idea I had was to reveal this on tap and hold. Like the modal, this is simply a hidden dynamic panel that reveals in a fade.


Here is the prototype (open on your iPhone for the best experience). I also did a screencast using AirServer—highly recommend it.

Well there you have it—a few different ways to create iOS effects using Axure. If you have any questions or a request for other interactions, leave a comment! A special thanks to Fortnight Lingerie for allowing me to use their branding in this prototype—very much appreciated.


Home Interfaces: The Bathroom

At Launch this year, Yves Béhar was one of the fireside conversations and he was phenomenal. He spoke a lot about design for the home. Béhar emphasized how vital simplicity is when it comes to experience in the home. The reason? Because any interface has to compete with the simplicity of the light switch. This inspired me to start thinking of mobility vs. mobile. I began brainstorming about what sort of interfaces would be useful for me and (hopefully) other people. The idea of interactive mirrors or walls became very appealing to me: something that serves one purpose but can be called on to serve another at a given moment.

Like most people, I spend most of my time getting ready in the morning in the bathroom. This is where I think about things I need to do, ask Siri what the weather is like, and figure out what I'm going to wear.

I thought: "What if we could have an interactive mirror in the bathroom that could display useful information as we are getting ready for the day?"

Activation

The first question that I pondered was "How would this mirror be activated?" Going back to the light switch analogy, I had to think about simplicity. It can't be a screen that needs to be booted up or paired over Wi-Fi with another device. The light switch already wins. It has to be automatic. Some thoughts:

  • The interface is initiated when a user with a Bluetooth-enabled device (phone or wearable) enters the bathroom. If there are two people, the screen interface can split for each person.
  • A smart toothbrush could be installed. When a user removes the toothbrush from the stand, it will identify the person by toothbrush and activate the interface. If possible, the toothbrush would take a saliva sample and display the person's biometrics on the mirror.
  • Voice activation. This would be a simple thing to calibrate and is a pretty solid way of authentication. From a security perspective I think this method would work because it is not displaying confidential account information and the assumption is the locked door of the house/apartment is still the primary method of security.
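The split-screen behavior from the first idea above could start as very simple layout logic. This is only a sketch of the concept; the pane names and the two-person limit are my assumptions:

```python
def mirror_layout(detected_people):
    """Map detected identities to mirror panes.

    With nobody detected the mirror stays a plain mirror; one person
    gets the full surface; two people split it left and right.
    """
    if not detected_people:
        return []  # no interface: it's just a mirror
    if len(detected_people) == 1:
        return [{"person": detected_people[0], "region": "full"}]
    # Split for the first two people detected; ignore any extras.
    return [
        {"person": detected_people[0], "region": "left"},
        {"person": detected_people[1], "region": "right"},
    ]
```

Whatever detection method wins (Bluetooth, toothbrush, voice), it would feed this kind of function a list of identified people.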

Features

Because the use case is a person getting ready for the day, the features should be a high-level overview—nothing too comprehensive. Some features I came up with are:

  • Weather. You would be surprised how many times I might go into the bathroom to get ready for the day before I even look outside. This interface would display the weather outside and even provide a forecast for the week.
  • The interface would make recommendations of what a person should wear based on the weather. Initially this might be a suggestion of general items, such as "Wear a jacket with a scarf." If there were an inventory API with every article of clothing the user owns, it could recommend specific attire.
  • Health metrics would be displayed and the interface can provide recommendations of activities or meals to eat.
  • Display a to-do list (not pictured). This could be plugged into an existing service like OmniFocus.
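The weather-to-attire suggestion could begin as a simple rule table before any inventory API exists. A minimal sketch; the thresholds and wording are entirely my placeholders:

```python
def attire_suggestion(temp_c, raining=False):
    """Very rough first-pass clothing recommendation from the weather."""
    if temp_c < 5:
        base = "Wear a heavy coat with a scarf."
    elif temp_c < 15:
        base = "Wear a jacket."
    elif temp_c < 25:
        base = "A light layer should be fine."
    else:
        base = "Dress light."
    return base + (" Bring an umbrella." if raining else "")
```

With an inventory of the user's actual clothing, the same rules could pick specific items instead of generic phrases.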

What are your thoughts? Would you use a product like this? How much would you spend on it if you would buy it? This is the first concept of the mirror and I will continue to iterate on it. I am hoping to design a prototype that can be dropped into After Effects to give you a video demo of what it would be like with someone using it.


Interactive Fitting Room Concept

I have recently been working on a lot more products that are interactive spaces as opposed to devices. A lot of these are mirrors. In the past few weeks I have pondered how certain spaces could be enhanced by juxtaposing an interactive experience with a physical location. The fitting room is one of them.

After some sketching and thinking, I tried to come up with some solutions that could remove friction from trying on and buying the products. For me personally, one big pain point is trying on clothes and realizing they don't look good on me. I played around with the idea of a "preview" function. The idea is to have the mirror act as a catalog the user can browse while in the room, as opposed to going back and forth between the rail and the fitting room. Someone could simply press "preview" and the article of clothing would be projected in the mirror, showing the customer what it would look like on him or her.

Ideally, there would be 3D renderings of the apparel so the person could turn around and see how it would look from a 360-degree perspective. If a person really wanted to try something on for real, he or she could press "try it on now" on the mirror and a sales associate would bring the item to the room.

This sort of experience could translate into a mirror at home. A person could simply preview a few items, have it shipped to his or her house or have it ready at the local retailer.

[Image: interactive fitting room concept]