Sunday, November 3, 2013

Product as Cultural Force

The Walkman, the Macintosh, the NES, the Gameboy, the Nokia 2110, the Playstation, the iPod, the Razr, the iPhone, the Kindle, the Macbook Air, the Kinect, the iPad...

Aside from being household names and hit products, what do they have in common?

Well, for one, they have far bigger footprints than the sum of their sales. These products are not just milestones in the history of technology, but archetypes in themselves. They occupy a disproportionate amount of consumer mindshare, and all their competitors are measured against them, to the point that even their best rivals get relegated to being *insert name here* killers. These are cultural force products.

The pinnacle of consumer technology success is making products that become cultural forces. It's more important than being the first, or even being the best, although those can certainly be excellent prerequisites. A cultural force product gives a halo to all the company's other products, one that likely won't wear off for years. These products take company investments much farther than regular products do. Companies even resort to using the brands of cultural force products long after they're gone; cases in point are Sony Ericsson's Walkman phone and Motorola's Droid Razr. Looking at the history of technology, companies' turning points from good to great usually coincided with the release of a cultural force product.

Creating such a product may be a surefire formula for greatness, but it might also be the most difficult thing to do. What is the secret to making a cultural force product? Cool technology? Price? User experience? Brilliant advertising? A combination of the above?

Let's take the exception out first: the video game console industry. Here, you have so few players and so much differentiation that there can be room for two or three cultural forces at once. Consoles only get refreshed once a generation, so they somehow get embedded in the public imagination.

The most obvious path to being a cultural force is to be the first in a category, and it works even with a poor first-gen product, such as the Kindle. But then again, there are also countless examples where it doesn't. Who remembers the first MP3 player? Or the first digital camera? Being the first e-book reader did give the first-gen Kindle huge leeway in the public imagination. It bought them time to refine the product while still retaining consumer mindshare. That's why companies are always scrambling for this position.

Related to being the first is coming up with a whiz-bang technology straight from the future. The Kinect is an example. It's guaranteed to generate a lot of press, and that is about halfway to becoming a cultural force product. Google, for example, wanted to be assured of this spot, so it announced Google Glass well before it was market-ready.

Aside from being the first, other superlatives don't work as well. The tech industry moves so fast that the world's-thinnest, world's-smallest, or world's-fastest crown only lasts a few months. Who knows that the reigning world's thinnest smartphone is the Huawei Ascend P6? What happened to the world's highest-resolution smartphone camera, the Nokia 808 PureView? Superlatives only matter insofar as they significantly improve the consumer experience.

A straightforward but dangerous path is a crazy price. This is the magic of Xiaomi. Its first phone was only released in 2011, but it has since become a massive cultural force in China. It's certainly not the best phone, but it could well be the best at its price point. If there's any one superlative people respond to more than anything, it's price. It was achievable for Xiaomi because the company had the chance to build a business model optimized for cost-cutting from the ground up, something that wouldn't be so easy for the incumbents.

Being the best device is the obvious golden ticket, but "best" is so subjective, and so many factors have to be squeezed in, that in effect it doesn't say much. WebOS, for example, was highly regarded as the most intuitive and user-friendly, but it sputtered and died not long after release. The Razr, on the other hand, wasn't even the best phone of its time, but it became a cultural force through timing, the right appearances (the Oscars), and an out-of-this-world (at the time) design.

Apple is a master at this. The reason Apple is so admired is that their entire product lineup is made up of cultural forces: the Mac, the iPhone, the iPod, the Macbooks. What could the secret be? They're definitely not the biggest advertisers. They play in crowded markets. They're almost never first in any category. They don't excel in superlatives. And their prices are crazy, but in the other direction. One thing they're great at, though, is balancing constraints and framing products as experiences. They're great at simplifying a product to the point that only the archetype is left. That's why, even though Apple is hardly ever first in a category, they somehow get to define that category.

Even so, these are not enough. What makes Apple products so compelling is that they also come with a mythology not found in other products, stories that stick like gum to consumers' minds: the vision of Steve Jobs, the minimalism of Jony Ive, the rebellious ethos of "Think Different", the approval of so many respected pundits. Combined with Apple's archetypal products, these stories make for a very potent mix.

Being integrated also brings Apple a huge cultural force advantage. Because they design the whole 'widget', people don't get to fragment Apple devices in their minds. The iPhone is iOS, and iOS is the iPhone. It's not like Android, where beneath the OS there are still so many choices. The paradox of choice in the post-PC era makes cultural force products more important than ever. You need to define your territory, and Apple, by virtue of who they are, get to take the entire iOS territory all by themselves. With Apple, you're not choosing a product, a brand, or even an experience; you're choosing a lifestyle.

Samsung has been trying very hard to create a cultural force product of its own. It's probably the reason they released the Note in the first place: to plant their flag in "phablet" territory. It's likely also the reason they released the Galaxy Gear smartwatch well before it was market-ready; being first in a category goes a long way toward the cultural force finish line. Evidently, Samsung hasn't figured out how to do it, but they have a back-up plan: gargantuan investments in advertising. They pour in so much money that the public has no choice but to think about Samsung. I also think their big courtroom battle with Apple was a blessing in disguise. Given that human beings are susceptible to dualistic black-vs-white thinking, people started to see Apple and Samsung as rivals on opposing ends. Samsung became the de facto Apple rival, and thus what people immediately think of after a mention of Apple.

These two companies couldn't be more different. Samsung is taking the brute-force path, while for Apple it's embedded in its DNA. Apple is the only company with the determination to cut a feature that would have been cool but would have kept the product from going all the way. It's the only company with the drive to create an entire media ecosystem just to make its products that much easier to use, and the only company with the craziness to build a brick-and-mortar retail chain in the age of e-commerce just to let people play with its products. Most importantly, it's the only company with the discipline to withhold a product release until it is ready to become a cultural force. That's why all this talk about Apple lagging behind Samsung in innovation is short-sighted, in my opinion. Why is Apple taking so much time to release a smart TV or a smartwatch? Releasing a product is easy, but releasing a product with the optimal combination of price, technology, consumer expectations, features, and timing is the most difficult thing to do. Apple understands that this long, arduous path is what it takes to come up with true cultural force products.

Friday, October 4, 2013

At the Tail-end of Smartphone Innovation

For the longest time, smartphone manufacturers have had a clear trajectory on their feature roadmaps. Tweak this spec and that, and the consumer cash will keep rolling in. Well, we have now reached a point where the spec wars are about to come to an end. Not just because consumers now care less about numbers, but also because the specs themselves are nearing their functional zeniths. Let's look at each major spec battlefield and see how close they are to that point.

1) Pixel density

Real innovation on this front pretty much ended when Apple first came up with the Retina display back in 2010. How can you beat something that already beat the limits of the human eye? Yes, the other OEMs caught up, but often to the point of absurdity. Today's pixel wars go well beyond Retina territory, but whether it's 326 ppi or 441 ppi doesn't matter so much anymore. If anything, it mainly serves as marketing ammunition. I can't imagine what else they can do next year except improve the harder-to-market qualitative aspects of screens, like brightness and color saturation.
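For reference, pixel density is just the diagonal resolution divided by the diagonal screen size. A back-of-the-envelope sketch (the diagonal sizes here are rounded marketing figures, so the results differ slightly from officially quoted densities):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal pixel count over diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# iPhone 4 "Retina" display: 960x640 on a nominal 3.5" screen
print(round(ppi(960, 640, 3.5)))    # ~330 ppi (officially quoted as 326)
# Galaxy S4: 1920x1080 on a 5.0" screen
print(round(ppi(1920, 1080, 5.0)))  # ~441 ppi
```

Past roughly 300 ppi at typical viewing distances, individual pixels are no longer resolvable, which is why the extra density mostly serves as a spec-sheet number.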

2) Screen Size 

Barring Apple, the OEMs have somehow convinced the public that bigger is more advanced and hence more expensive. The Galaxy S series serves as an excellent industry yardstick. The first Galaxy S was a 4-incher. It then grew to 4.3 inches with the S2, 4.8 inches with the S3, and now 5 inches with the S4. I wonder if it will cross over into phablet territory next year. If so, the S Pen would be the only differentiator from the Note series. Speaking of phablets, it's also worth wondering whether they will keep growing into full-fledged tablet phones. I doubt it, though, since we have this curious phenomenon where consumers pay a lot less when they perceive a product to be a tablet rather than a smartphone.

3) Processor Speed 

I see desktop processor speed parity as the would-be ceiling for mobile processors. Of course, it's not an apples-to-apples comparison, since it's essentially x86 vs. ARM. It's also true that a phone can never be too fast, but I think that since the dual-core wars of 2011, we've crossed the point of diminishing returns. Moving to quad-core in 2012 was already overkill, and the release of octa-core chips this year is just insane. Device performance is supposed to go beyond a mere battle of cores and clock speeds, but companies still treat these as their feature roadmap's low-hanging fruit even though they don't mean much anymore. Apple provides a breath of fresh air here, adamantly going with lower clock speeds and fewer cores yet still blazing the trail in performance. This year, it opened up a new battlefield with 64-bit mobile chips, so at least for the coming year there's still something for manufacturers to hold onto. Beyond that, it's cloudy.

4) Camera Quality and Megapixels

This battle is not just for smartphone cameras but for cameras in general. No matter how many times it's been clarified, the general public still somehow believes that more megapixels lead to better pictures. This year, we're in the midst of the 13-megapixel battle. There are even standouts with more megapixels, like the 20.7-megapixel Sony Xperia Z1 and the 41-megapixel Nokia Lumia 1020, which unfortunately just propagate the megapixel myth. Consumers have to understand that these are outstanding cameras not because of the megapixel count but because of the sensor size, OIS, and bigger pixels. It's nice to see Apple bowing out of the race with the iPhone 5s, which 'only' has 8 megapixels but takes amazing pictures. In general, the camera is still a fertile battlefield, and with the rise of photo sharing, one that is more important than ever. The room for improvement is still huge, with the ceiling being SLR-caliber point-and-shoots like the Sony RX100.
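To see why sensor size matters more than the megapixel count, here's a rough sketch. The sensor width and pixel counts below are approximate figures assumed for illustration; the point is that packing more megapixels into the same small sensor shrinks each pixel, and smaller pixels gather less light:

```python
def pixel_pitch_um(sensor_width_mm, horizontal_px):
    """Approximate pixel pitch in micrometers: sensor width over horizontal resolution."""
    return sensor_width_mm / horizontal_px * 1000.0

# iPhone 5s: roughly 4.9 mm wide sensor, 3264 px across (8 MP)
print(round(pixel_pitch_um(4.89, 3264), 2))  # ~1.5 um
# A hypothetical 13 MP sensor of the same width: 4208 px across
print(round(pixel_pitch_um(4.89, 4208), 2))  # ~1.16 um
```

The Lumia 1020 gets away with 41 megapixels only because its sensor is far larger than a typical phone's, keeping each pixel reasonably sized.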

5) Battery Life

The most fertile ground in my opinion. It's the one aspect consumers care a lot about yet very little is being done. In fact, the whole smartphone revolution has been a big step backward for battery life. We've seen some promising solutions such as the Razr Maxx and the Note series, but whoever can genuinely solve this issue will be hugely rewarded by consumers in the years to come.

6) Removable battery and expandable storage

Another big step backward for the industry. Most OEMs that went the Apple way and dropped these turned out to shoot themselves in the foot, because common consumers' anxiety over storage and battery life outweighs the benefits of going without them, like beauty and thinness. Actually, I surmise the real reason is that OEMs want to earn more from their higher-storage SKUs, something Apple has hugely profited from. I reckon this is why consumers are now flocking to Samsung in droves: it's one of the few companies to really understand that consumers come to Android mainly for the flexibility, and that this shouldn't be taken away from them.

7) Other bells and whistles

Last year, it was NFC. This year, it's IR blasters. And there's even more for software, from voice assistants to gestures to faux multitasking. Then you also have the specialized bells and whistles... your S pens, fingerprint scanners, and IP67 water resistance. With differentiation and improvements in the other aspects now harder than ever, expect most of the innovations to come in the form of even more bells and whistles, specific features that set a handset apart from otherwise identical slabs of gargantuan-screened, high-megapixel, multi-core, and pixel-dense freaks of nature.


Barring a surprise in the bells-and-whistles department (like flexible screens or holograms), the innovation roadmap for smartphone hardware seems to be down to battery life and camera quality. These two specs are important, but they might not be eye-catching enough to convince consumers to part with their money. Even this year, upgrades are starting to slow down, since the previous year's flagships are still well above 'good enough'. That's why the talk of wearable computing is getting louder and louder. The smartphone market now seems poised to become as boring as the PC market before long, and gadget manufacturers are scrambling for the next hundred-billion-dollar market. Everyone wants the answer to the question: after smartphones, what will be the next big thing?

Tuesday, August 6, 2013

Goodbye iPad, Hello Macbook Air

I used to be an Apple hater. Of the few Apple products I owned in the mid-2000s - 2 iPods, 1 Mac Mini - none of them worked properly. Syncing music to my iPod was a pain, and my Mac Mini barely lasted a year before breaking down. In 2011, after much hesitation, I took a gamble and bought the 32 GB iPad 2. Contrary to my previous experiences with Apple, it surprised and delighted me in so many ways. It's the gift that keeps on giving. It's the best device I've ever owned.

What miracle transpired during the second half of the 2000s to cause this massive stride forward? Well, iOS happened. To this day, it's the most intuitive OS on the market. Idiot-proof and lightweight, it made interacting with devices a joy. The iOS App Store was a holy-grail milestone for Apple, where it found the perfect balance between the divergent approaches of the walled garden and openness.

When I got my iPad in late 2011, I ran an experiment: I would live a PC-free existence and take on the post-PC lifestyle. It went so well that I questioned whether I would ever need a PC again. I read news and articles on Flipboard, then saved them to Pocket and Evernote. All my email became instant and accessible through the Mail app. To my delight, most video websites also supported the iPad, giving me an abundant trove of shows and movies to watch on top of the ones I could watch within apps. I downloaded the latest games, mostly just to see the mechanics and while away my free time. I could do almost everything I needed on my iPad save for file organization.

To further deepen the usage of my iPad, I got two new accessories, a stylus and a keyboard. The stylus allowed me to take full advantage of Paper and revive my sketching habits. The keyboard transformed my iPad into a mini-touchscreen laptop. It served as an excellent stand and case while empowering me to write notes and articles anywhere. In a previous post, I raved about how the iPad plus keyboard combo was the ideal computing solution for me. And it really was, at that time.

The wonderful thing about my experiment was that it fully immersed me in the world of apps and easy computing. It made me want to take part in the revolution. While I was happily living my post-PC life, I knew I would eventually need something more powerful for hardcore computing. In the last few months of my experiment, I waded into the PC waters to see what might be a good buy. The Lenovo Yoga convertible laptop came the closest, but Windows 8 really put me off no matter how much I tried to like it. The Macintosh, on the other hand, was becoming ever more convincing. The unintuitive parts of my old Mac Mini had been replaced with iOS elements, making Mac OS almost an iOS-plus of sorts. I grew even more convinced to make the jump after watching firsthand as my brother's and my Acer laptops died one after the other while my dad's Macbook continued to hum along. And the best part: the price points of Apple laptops were comparable with their Windows counterparts, even as they provided a superior form factor and, of course, bragging rights.

When the new Macbook Air came out in the middle of this year with battery life (one of my most prized features) even longer than my iPad, I knew it was finally time to pull the trigger. Last week, the stars finally aligned. I had acquired enough savings, the gall to flush it all down the drain, and the chance to buy my most coveted gadget - the 2013 Macbook Air.

Shortly after getting my new laptop, I got an offer from my cousin to buy my iPad. I needed to decide fast, since I didn't have much time at home. The iPad was at the center of my gadget mix, the device I spent the most time on. Originally, my plan was to slowly wean myself off my iPad dependence by keeping both the iPad and the Macbook with me at the same time, using one or the other depending on the task at hand. But I only have so many hours to spend on my gadgets, and the Macbook Air is going to have to be at the center now. As a bonus, it's going to force me to create more rather than consume more. I knew that between my smartphone, my Kindle, and my new laptop, the iPad was a nice-to-have yet ultimately redundant device. I decided to run another experiment. After 1 year and 9 months, I will try to live an iPad-less life.

Today is Day 3 of the experiment. I knew it was going to be hard to adjust without my regular companion of almost two years. Coming from the iPad, the Macbook Air has its share of good and bad. I am elated by the fact that I now own the best laptop in the world (according to most reviews). The blade-like device gives me a sense of power, as though it could slash through any task I ask of it. The trackpad is surgically precise; it's so good that I hardly miss my iPad's touchscreen. Reloading tabs is no longer a problem. I can finally use USB drives again. Multitasking is a breeze, yet monotasking is also easy with full-screen apps. I can have my cake and eat it too.

Nevertheless, there are things I still miss a lot about my iPad. The Air is a very handsome device, but it borders on intimidating sometimes. Unlike with my iPad, I feel I have to be very careful when handling it. The general playfulness of iOS is also replaced by the cold brushed metal of Mac OS. The App Store here looks like a sparse high-end department store compared to iOS's bustling shopping mall. I miss the handy portrait form factor. I miss all the fun games and the joy of downloading a new app. I miss Flipboard and Paper, two of my favorite apps that are close to impossible to replicate on the Air.

Out with the old and in with the new will always be bittersweet. For now, I'll have to split the load my iPad carried over the last two years. My Kindle will handle books and long-form content. My phone will take on even more social updates and gaming. My laptop will take on web browsing, online videos, writing, and all the wonderful things my iPad couldn't do. It's going to take some getting used to, but I believe it will be a big step forward for my workflow. I'm still planning to complete my ultimate gadget mix with a Retina iPad Mini in the near future, but for now, it's goodbye iPad, and hello Macbook Air.

Monday, July 15, 2013

Disconnect

It's really funny how the world works. While the internet is now a commodity most of us can access, disconnecting from it has become a luxury few of us could afford.

The rise of social media is the biggest culprit. We have become hooked on the dopamine rush of every newsfeed refresh. We have grown afraid to miss any updates. Every day, we allow ourselves to be bombarded.

For me, it usually starts with a visit to my Facebook newsfeed:

...someone's wedding picture
...an NBA highlights video
...the new review of Macbook Air
...some weird story about a dancing cat from Yahoo
...news posts about the stock market slump
...open other interesting things in new tabs.

Right when I'm about to go over to the tabs, new comments on the photo I just posted pop up as notifications. A few minutes later, somebody messages me. I post a short reply, then go back to my newsfeed. I click on a college friend's vacation album. A parade of likes.

Every day, the accumulation of these activities from a single visit to Facebook can eat up close to an hour. Since most of us access Facebook multiple times a day, every day, we're looking at a significant chunk of time.

It's not just an issue of time. Recent psychological studies have shown that seeing updates from friends often causes more pain than pleasure, because in the social media world all we see is everybody's best foot forward. A paraphrased quote from Psychology Today: "In the social media world, we're comparing our real selves to others' advertisements of themselves." Facebook is digital masochism, yet it weirdly feels good. Worse, it's not enough for most of us. The more we update, the more we want to keep updating.

After my Facebook feed is exhausted, I go and check Twitter. Even faster real-time updates. Open even more new tabs. So many interesting things from people I follow. The feed is endless, so I move on.

Next stop, Tumblr. Browse images and gifs quickly. Like and open new tabs again. My brain is starting to hurt from the deluge, yet I cannot stop. Type 'linkedin.com' on my browser. The semblance of work somehow lessens the guilt. View top posts from the influencers. So-and-so has a new job. Like some posts. Read some articles.

By then, my brain has turned into a chaotic mush. I'm running out of websites to go to. Yet I still. cannot. stop.

From the corner of my eye, the green blinking light of my phone invites me to pick it up. It's estimated that smartphone users on average check their phones 150 times a day. Seems like I'm no different, as I grab mine without hesitation. Open Instagram. New followers, new photos. More liking. Oh, somebody messaged on Wechat. There's an alert saying my Wechat feed has new updates too. Browse this first before replying to the message. While I type my reply, my phone vibrates again. Messages from Whatsapp and Viber.

I want to reply now, but I wonder what happened to the Facebook world since I left it 30 minutes ago. Type f-a-...

I repeat the cycle. Many times a day, everyday.

It's ironic how being so connected is starting to make us disconnected. After browsing our newsfeeds, what do we have left? Although we hardly notice, most of the small updates we come across every day end up as meaningless chunks in our brains, too small to produce insight yet big enough to take up cognitive load. They severely shorten our attention spans and even dull our memories. Using social networks is a perfect example of high in-the-moment happiness but little remembered happiness. The small bits of information mostly vanish, like grains of sand slipping through our fingers.

What our minds need to build on are boulders of information, big blocks with staying power: movies, books, online courses, creative projects. But since they are much harder to take in than social media updates, the only way to engage with this longer-form content is to disconnect once in a while. To reverse the inertia of constant updating.

I admit that it's very hard to do. Social media feels good precisely because it's the easiest way to feed our thirst for the new. We're naturally curious to discover the latest and greatest, especially about the people we care about. For the first time ever, social media affords us that privilege. Anytime, anywhere, always just a few clicks away.

There's no escaping social media; it's the defining phenomenon of our time. I still think that a world with social media is at least slightly better off than a world without it. As with most things, moderation is the way forward. In my case, I have no choice during the weekdays since my job is tied to it. But during weekends, I always try to resist the temptation of opening my whatever feed, and instead engage in long-form activities like reading a book or watching a movie. And what a pleasure it is. It's so liberating not having to click 'Like' on yet another senseless meme, or getting tempted to refresh the newsfeed for the 30th time.

Being offline frees me up to meet people, shop, do some sports, and collect my thoughts. I'm constantly surprised at how much I can do simply by disconnecting. In fact, I feel like I'm more productive during weekends than on weekdays when I'm hyperconnected. Disconnecting enables me to think real thoughts, post them in blog entries, and come to terms with an oft-forgotten paradox: To disconnect is often the best way to truly connect.

Friday, June 28, 2013

Crisis of Innovation

The digital revolution is diverting talent and attention from other industries that are ripe for innovation. Transportation is one good example: significant progress hasn't been made in automobiles, airlines, or space travel in the past three decades. Pharmaceuticals are moving at a glacial pace. Cosmetics are no different. New architectural ideas are formed, but our construction materials remain the same. There's been a lot of talk in energy, but little action and even less traction. Venture capitalist Peter Thiel has said that we are not as innovative as previous generations, contrary to what's paraded in the media. Our best minds are simply figuring out better ways to sell ads online. Or making phones thinner. Or cramming ideas into 140 characters. We need more moonshots. The recent ones have been encouraging to see, particularly Google's self-driving cars and Project Loon. Entrepreneurs like Elon Musk understand this crisis; even though he made most of his riches from Paypal, he's looking to disrupt transportation and energy. Digital is still a goldmine, and it will continue to be, especially in its cross-pollination with traditional industries. But there's also a lot of gold to be mined in disrupting fields like construction, with new materials for efficient building; biotechnology, in the form of artificial organs; or aeronautics, through faster ways to fly. The digital space has enough innovators. There will eventually be diminishing returns on the talent coming in. It need not be so crowded.

Thursday, June 20, 2013

On Innovation

I’m tired of seeing people put the innovation label on everything. The latest ‘in thing’, the latest buzzword - O2O, mobile, social, digital, smart devices, quantified self, *insert sexy term coined by an expert here*… Yeah sure, going into these things will instantly make you more ‘innovative.’

Bullshit.

Being new is not all it takes to be innovative. It’s not about shiny gadgets or bigger screens or higher megapixel cameras. Anyone can come up with something new. Not to discount the effort that went into developing them, but these are gimmicks, these are fads. These things actually run opposite to true innovation.

Innovation has to genuinely elevate people’s quality of life. Some innovations are great and long-lasting: the wheel, the light bulb, the printing press, the automobile. Some are incremental: lighter tablets, higher-resolution displays. Innovation is what makes life that tiny bit easier and more empowering for its users.

What separates innovations from the wannabes is not how technically impressive the achievement is. It’s about answering one fundamental question: why does it need to exist? If the answer is an improved way to amplify humanity, then yes. If it’s just to increase revenues or to innovate for innovation’s sake *cough Samsung*, then no.

Sunday, May 19, 2013

What is Good Design?

Good design starts with an objective, an answer for a ‘job to be done’. Finding this raison d’etre is the most important step, the core around which the entire process will revolve. Once set, you gather everything in nature and in your consciousness that relates to the objective, and start building.

When you're done, tear the whole thing apart. Take only the parts that are aligned to the core, those that have a good answer for every why. Build it back. Repeat the process until you are at the point where the difference between the whole and the sum of its parts is the greatest.
 
This applies to any man-made work. A gadget, a poem, a meal, a novel, a building, a painting… The process is easy. Gathering the materials is easy. Even setting the objective is fairly easy. What is difficult is knowing when to stop in the process of creative destruction.

Stop too early and you might miss some polish. Stop too late and you will have created extra appendages. Good design is organic; it's alive. You can just feel it when something is well designed. The designer stopped at the perfect moment.

Sunday, March 31, 2013

Breaking Free from Glowing Rectangles

The world has shriveled to a mass of glowing rectangles, and we are all prisoners.

Before the onset of digital technologies, we used to consume content in different layout sizes and materials. We read news on rough gray broadsheets, and content related to our interests in glossy magazines. We dove into stories in paperback books the size of our palms, and into nuggets of human knowledge in encyclopedias only slightly bigger. We flipped open dusty photo albums to take trips down memory lane, and drove to the theater to savor the latest blockbusters. In the good old days, we had all kinds of containers to house different kinds of content.

The arrival of digital technologies was supposed to be a cause for celebration. Finally, we didn’t have to carry so many things anymore. All our media, from news to movies to songs and books, could fit in one device. No matter what kind of media we fancied, we could find them within the rectangular windows of our PCs. As screens grew more and more powerful, they gobbled up many standalone devices along the way. Magazines, calculators, gaming devices, music players, cameras… they all disappeared into the omnipotent rectangle that, over time, shrank into our pockets.

The newfound convenience, however, came at the expense of a more immersive content experience. Everything was locked behind glowing windows. To fit the screens, our content had to adapt into apps and mobile websites. Even hardware design had to conform to the all-powerful screens. With the rise of tablets, we got many more screen sizes, but all that meant was a further spread of the 'glowing rectangle' epidemic.

The main problem that persisted is that the ‘all-in-one’-ification of our content and media made us numb to changes of activity and media form. Because we could do all kinds of things, from playing games to watching movies to talking with friends, on a single device such as an iPad, our experience with content lost its texture. Because we could access everything with mere clicks and swipes, we lost the boundaries between our activities. We regressed from the time when a change of activity meant a change of device or even location, and therefore a palpable change in mood and mindset. Digital technology actually turned out to be a step back from the glorious era when the size, shape, texture, and even smell of the material housing our content added so much to the experience.

Skeuomorphism stepped in as a half-hearted attempt to add texture and context to our smooth, boring rectangles, but it was heavily criticized and rightly so. Standalone devices like e-book readers also sought to bring analogization back, but instead people saw them as incapable one-trick ponies.

The tech industry had little else up its sleeve. In recent years, we simply got more and more entrenched into the world of screens. Innovation came down to thinner bodies, bigger screens, higher resolutions, and faster processors. It's starting to get boring.

The only way to make tech exciting again is to liberate our content from its rectangular prisons. The obvious solution is beyond-the-screen technologies. A TED talk a few years ago introduced the world to augmented-reality projections (SixthSense). It was a revolutionary way of projecting any kind of content onto any surface, a sort of Midas touch that turned analog lead into digital gold. We're still quite far off from this vision, but we are beginning to see technologies trying to break free from the glowing rectangle in the form of wearable devices (Google Glass), virtual reality (Oculus Rift), haptic feedback (Tactus Technology), and flexible screens (Youm Display). It’s also promising to see beyond-touch input interfaces like gesture control (Kinect and Leap Motion), voice control (Siri), muscle tracking (Myo from Thalmic Labs), and eye tracking (Smart Stay). As these technologies evolve, they will not only let our content exist in the forms it was meant to, they will make it more immersive than ever.

The ideal picture is a mix of both worlds: the flexibility and variability of analog surfaces coupled with the power and speed of digital interfaces. When that day comes, content will no longer conform to a limited array of devices. Instead, the entire physical world will conform to content. It's a future where we no longer have to be prisoners of our screens, and it can't arrive fast enough.