Netizen, Choose Your Ecosystem!


Let’s face it, the time has finally come.  The “cloud” is here and for the most part it works wonderfully, but there are strings attached, and unfortunately you have to pick sides.  Okay, you don’t have to pick sides.  You could just take whatever comes your way and make decisions as you go, but that would unnecessarily complicate your life. Technology is designed to do exactly the opposite.  However, all technology, from the earliest stone tools to the latest gadget, requires some thought and training to be used properly.*  With the ubiquity of cloud services and mobile devices (currently in the U.S., roughly half the population has smartphones, and tablets are selling like hotcakes), a plethora of options has surfaced that was not there before, offered by established players and upstarts alike.

The cloud means a lot of things to a lot of people, so let’s clarify what I’m talking about here.  By “the cloud” I mean that set of services that a “large” portion of device users engage with on a “regular” basis.  In other words, the minimum set of services that a provider must offer that work well across devices, and which are well-integrated both with each other and the devices.  Sound vague?  Well, let’s make it concrete.  Here are some “must-have” services that I think most users would agree they would like or need:

  • email and messaging [Ap, F, G, M, Y]
  • calendar and address book [Ap, F, G, M, Y]
  • web search [G, M, Y]
  • product search [Am, G, M, Y]
  • news [G, M, Y]
  • voice and video calling [Ap, G, M, Y]
  • social networking [F, G]
  • maps, directions, and local search [Ap, G, M, Y]
  • photo/video/file sharing [F, G, Y]
  • buy/rent and consume media (movies, books, music) [Am, Ap, G]
  • document, spreadsheet & preso editing and sharing [G, M]
  • backing up your files [Ap, G, M]

The bracketed letters are shorthand for the “big six” technology companies – the heavy hitters offering internet-based services to the public: Amazon, Apple, Facebook, Google, Microsoft, and Yahoo.**  I’ve noted which of these companies currently has a strong offering for each of these services.  While we could debate my definition of “strong offering,” it is still instructive to scan down the list to see which companies have the most comprehensive portfolio. The order looks something like: Google, Microsoft, Yahoo, Apple, Facebook, Amazon.  This could change with time, but that’s how things stand now.***

There are other important factors aside from feature sets that revolve more around the company itself.  Some I can think of include:

  • someone you can trust. [Am, Ap, G, M, Y]
  • someone who lets you own your data. [G, M?, Y?]
  • someone who takes security and privacy seriously. [Am, Ap, F?, G, M, Y]
  • someone who’ll be around for a long time. [Am, Ap, F, G, M, Y?]
  • someone who has their own OS and devices. [Am, Ap, G, M]
  • someone who has a good track record of things “just working.” [Am, Ap, F, G]

Personally, taking all of the above factors into consideration, I’ve decided to “go Google,” to use the marketing phrase that basically means moving all of your usage of online service applications (like word processing, email, social networking, etc.) to Google’s cloud based systems.  In full disclosure, last summer I took a job at Google (and so far it’s been great), so I’ve already gone to Google, but now I’m “going Google.” Although the former did accelerate the latter, I was already leaning that way to begin with, so it likely would have happened anyway.   In fact, it’s more likely that I chose to work at Google because it is the best choice in public cloud services. For me, it’s pretty much a no-brainer – Google  has all of the services and all of the right company characteristics.   I’m not the only one coming to these two conclusions – technology writer John Battelle has sided with Google in what he calls the “cloud commit conundrum,” and renowned inventor Ray Kurzweil has recently joined Google.

Like this angry guy, you may resist the necessity to choose an ecosystem, but the reality is that as a netizen, living in the cloud and trusting someone with your data will only become harder to avoid over time.  Just ask Bruce Sterling.  As John Battelle points out, by just buying a device, you are already implicitly partially committing to one of the players. You could spread your data across providers to hedge your bets, i.e., not put all your data in one “basket.”  But I would argue that you will probably end up paying for it in the long run with painful migrations or outright loss of data.  Perhaps that is a price you are willing to pay to prevent one company from knowing “too much” about you.

Keep in mind you’ll also be paying another cost – the lost opportunity for truly integrated and personalized services.  Amazing things like swapping out devices and having everything just work – all of your stuff, your preferences, your user model, your personal assistant will simply be there.  As an example, I’ve drafted this article over the course of several months, seamlessly using six different devices (2 Macs, a PC, an iPad, an iPhone, and a Nexus tablet) and Google Drive apps.  If I had been limited to using just one of those devices, I’d never have finished.  As a second example, after recently getting a Nexus 7 tablet, I’ve been able to experience a great new product called Google Now – a service that automatically makes suggestions that are relevant to you personally.  When I first brought it up, it already had a suggestion on how to navigate to a restaurant I had searched for earlier that day from my iPhone – how cool is that?  If I had been using a different search engine, that never would have happened.

Does this mean I’ll stop using the other five “big guys”?  No, of course not.  Not only do they provide some services I can’t get at Google (yet!), but as I’ve explained in the past, considering the business I’m in, I can’t afford to ignore the competition.  But it does mean that I’ll limit my use of their services where they overlap with Google’s, which is in a lot of places.

It’s time to choose your ecosystem – what factors will you consider?

* I visualize a prehistoric parent carefully demonstrating how you never cut towards yourself with a stone implement.

** You could throw in a few others into the mix here, like AOL, eBay, LinkedIn, and Twitter, but their service portfolios are not at the same level as the others (yet!).  As an aside, I am amazed that when other people make similar lists, they exclude Yahoo.  Its offerings are far too broad and user base far too large to ignore, regardless of its mediocre track record.  And with a new CEO at the helm, things could easily turn around for them.

*** In fact, you could look at the places where companies have gaps in this list and the next to see where they might be headed.


The Perfect Happiness Storm


Singing in the Rain

This year, instead of my traditional review of TED talks, I’m going to link to a single page on the TED site – but first let me explain why.

As I grow older, my need for gifts at Christmas diminishes, but this year my wife hit the nail on the head when she gave me a small pocket book titled “Be Happy.”  Each page has a one line aphorism like “Get a good night’s sleep” or “Keep learning” or “Don’t isolate,” along with a simple cartoon illustration.  In 60 lines of text and less than 300 words, it pretty much condenses all you need to know about how to achieve happiness with about 95% accuracy.

The reason it was so apt is that I’ve come to realize that finding the key to happiness has been at the core of my being my entire adult life, and with my wife’s help, I’m just now beginning to really get what it means to be happy. Although I’m not all the way there yet, at least I feel like there is a path for me to follow that will take me progressively closer.  As I mentioned in my previous post, an analysis of CNN Money’s “Best Advice I Ever Got” shows “follow your passion” to be a very common piece of advice.  After doing that analysis, I started thinking about what I’m passionate about – what should I be “following”? I started listing things, and the one that resonated with me most was “finding meaning and happiness, knowing myself.”

There seems to be an increasing interest in happiness, not only on a popular level (as one indicator, searches for “happiness” have been on the rise in the last few years), but also in terms of the scientific study of what makes us happy (see, for example, Happiness: No Longer the Dismal Science, or Maslow 2.0: A New and Improved Recipe for Happiness).

This year, I discovered a new feature that LinkedIn put together – soliciting The Biggest Ideas of 2013 from about 60 different influencers.  Many of the so-called big ideas are just whatever the author happens to be working on (i.e., they were using this feature as a self-promotion tool).  So in an attempt to discern the larger themes, I ran all of the text of these ideas through a textual analysis.  Naturally some terms like “social media” and “data” rose to the top as being important themes for this year’s big ideas.  But to my surprise, “happiness” was also relatively high on the list.
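The kind of textual analysis described above can be sketched in a few lines. This is only a toy illustration of the general approach, not the actual analysis I ran – the idea texts and the stop-word list below are invented for the example:

```python
from collections import Counter
import re

# Words too common to be interesting themes; a real analysis would use
# a much longer stop-word list.
STOP_WORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "for",
              "on", "will", "be", "are", "at", "not", "how"}

def top_themes(texts, n=3):
    """Count term frequencies across texts so recurring themes rise to the top."""
    words = []
    for text in texts:
        words.extend(re.findall(r"[a-z]+", text.lower()))
    counts = Counter(w for w in words if w not in STOP_WORDS)
    return counts.most_common(n)

# Invented stand-ins for the influencers' "big idea" texts.
ideas = [
    "Social media will reshape how data informs happiness research",
    "Big data and social media are the story of the year",
    "Happiness at work depends on data, not perks",
]
print(top_themes(ideas))  # terms like "data" bubble up across the texts
```

Even something this crude surfaces the cross-cutting terms; the surprise in my case was seeing “happiness” rank alongside the predictable tech vocabulary.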

Last, but not least, Netflix recently recommended, and I happily watched, The Happy Movie.  It was great.

So it’s all coming together – a book and a movie about happiness arrive at the same time that some scientists and influencers are focusing on happiness, while simultaneously and independently, I decide that happiness is my passion.  Naturally, I went to the TED site to see what those bright people might have to say about it, and you know what I found?  A whole collection of talks that TED had already organized around just this topic!  So, to start the new year off, here’s a dose of happiness talk, curated by the fine folks at TED.

As I’ve learned elsewhere, one of the components of happiness is sharing your passion with others, so I’ll try to do more of that here on this blog by following up on the themes I discover as I try to discover the secrets of happiness.

Happy New Year!



I Ride My Bike Every Day, Life is Good


This morning I got a little reminder of the wonderful gift of life.  I had to drop off a rental truck a few miles from my house, and needed to get home.  My usual ride (my wife) was busy, so I just took her “cruiser” bike, threw it in the back of the truck, and pedaled my way home after dropping the truck off.  It was a bit cold (in the high 30’s), but otherwise a nice day for a ride.  I must have been quite a sight, all bundled up against the cold, riding a too-small-for-me bright orange cruiser with a wicker basket on the front.  And a big-ass smile on my face.

Until a few years ago, I used to be a fairly avid cyclist.  A 20-mile quick training ride was the least I’d consider worthy of my time.  Unfortunately, a just-serious-enough knee injury has kept me out of the saddle, but as I dismounted this morning, I was reminded of those physical feelings I used to get.  The cold-numbed thighs and cheeks, the wobbly legs and sore backside.  It didn’t take much – just a few miles on a simple one-geared coaster-braked cruiser – to drop me right back into that euphoric state of mind.  I really missed riding.

I recently read a couple of articles on CNN Money in which they interviewed dozens of highly successful people and asked them to name the one piece of advice they had received that they considered most valuable.  Being a data scientist, I naturally viewed this as an opportunity to mine the answers for wisdom.  I ran the text of the answers through a simple auto-summarizer that I like to use (more about that in an upcoming blog post), and categorized the results.  The second-most important theme that emerged was the hackneyed, but nevertheless true-to-the-core aphorism to “follow your passion.”  (The most important theme?  I’ll also cover that in that other blog post.) Until this morning, I had forgotten about how much I loved riding a bike.  That passion had fallen by the wayside, much to my physical and, more importantly, mental detriment.

A few years back, my wife bought some blank notebooks.  They were a very simple design, with a string-clasp closure, and a cover with a line-drawing of a boy on a bike and the caption, “I ride my bike every day.  Life is good.”  I thought they were cute, but didn’t really get them until now.  The notebook covers, assuming you use them often enough, are a constant reminder to find those things which bring you joy, and to make doing them a habit, a part of your daily routine.  For me, that coincidentally happens to be riding a bike, and so I’ll be prioritizing replacing my 20-year-old road bike with something that is gentler on my aging body in the coming year.

I was originally going to title this post “Life: Love It or Leave It,” but decided that could easily be misinterpreted if taken literally.  But metaphorically, it is spot-on.  As we turn the page to a new year, it’s time to ask ourselves, “What is my ‘bike’?”


Synthesize, Prioritize, Evangelize – Thinking Big, Part 2


The Big Idea

Great leaders think big. They take the seed of an idea and grow it into something larger than anyone else would have thought possible.  In my previous post “Summarize, Generalize, Hypothesize – Thinking Big Part 1,” I wrote about a few techniques for identifying great ideas, and turning small ideas into big ones.  In that article, I focused primarily on the nascent stages of idea development, of getting in the habit of mentally inducing something larger and more impactful.  That part of the process is naturally a personal, internal one – with the ultimate goal of becoming so ingrained that it happens automatically within a few seconds of hearing or thinking of an idea.  But great ideas are ones which also survive a vetting process where others who are likely to have valuable input, and who may also eventually be involved in implementing the idea, also contribute to the expansion and filtering of the details.

Ideas are a dime-a-dozen, which leads to the intriguing implication that most ideas are worthless, and a precious few represent the nuggets we all search for.  Because most ideas are free, they have a couple of interesting properties.  First, people will share them freely, and second, ideas are (or should be) easy to throw away.  These two properties naturally lead to the next two steps in ideation: Synthesize and Prioritize.  The third step, Evangelize, helps bridge the process of ideation that began with “Summarize, Generalize, Hypothesize” with the rubber-hits-the-road reality of execution, while simultaneously building the momentum and passion necessary for success.

Unlike the first three “internal” steps, these last three “external” steps are neither mutually exclusive nor discrete, nor even done “in serial.”  They can and should be practiced as a cycle that gets repeated a number of times, with each step running somewhat in parallel, each time increasing the size of the circle of people involved.  This is illustrated in this diagram:

Thinking Big Flow


This phase is all about getting others involved, a little at a time, to improve upon your core idea, flesh it out, or even help discover its true core.  You will quickly find out if the idea is no good, because either people will tell you so directly, or no one will want to follow up on it.  But if the idea  is great, you’ll get all kinds of feedback, volunteers, and other support you’ll need to build a foundation for execution.


Synthesize

Synthesis is simply the art and act of actively soliciting others’ ideas and rolling them together into a grand vision.  There are many ways to do this, from sharing it with a trusted partner, to formally brainstorming it with a group of colleagues, to building a full-fledged “business case,” doing a SWOT analysis, or building a prototype and pitching it to upper management or potential investors.  The point here is to start small, and increase the “size” (in terms of number and influence) of participants with each round of the cycle, so that the idea gets more solid with each review.  As an example, in this article by successful entrepreneur Steve Blank, he says “…my partner Ben’s office was the first place I would go when I thought I had new ‘insights.’ And we’d run them to the ground for days before we’d even let anyone else know.”

It is easy to get caught up in the excitement of potentialities when synthesizing, and forget that, most of the time, ideas don’t pan out.  That’s why prioritization should follow close on the heels of synthesis.


Prioritize

When he talks about innovation, Astro Teller (the head of Google X) always emphasizes the importance of brutal prioritization of ideas (see, for example, this summary: Some thoughts on innovation from Astro Teller).  Starting with the fundamental observation that the vast majority of ideas are bad (or at least not “big”), it follows that most ideas should be killed as quickly as possible.  This should be done using the simplest reasons first – like “not physically possible” or “illegal or immoral.”  Some ideas that are not immediately killable should be explored in a brainstorming or conceptual “white paper” mode (as described above).  The few ideas that survive this should then be built into a prototype or “minimum viable product” (to use the language of Eric Ries’ “Lean Startup”), and then tested to see if they are actually good enough to iterate on and productionize.  The point here is that you should assume your ideas are bad until you can prove otherwise.

To truly lead, though, your ideas must be big.  What makes an idea big?  Teller offers a simple checklist – consider these three things:

  • Is the problem it solves big?
  • Does it actually solve or significantly ameliorate the problem?
  • Do you have a reasonable starting point?  (e.g., sure it would be great to build a time machine, but there’s no reasonable place to start)

It pays to not get too attached to your ideas – chances are they won’t survive the Synthesize and Prioritize steps, at least not intact as they originally sprang forth from your head, and you have to be okay with that or you’ll never get to the point of executing.  And it also never hurts to be reminded during this process that less is more, and you should always try to boil the idea down to its core.*


Evangelize

When first synthesizing, you’ll naturally be sharing your idea with a small group of people.  But once the group has hardened the core of the idea and reduced it to its smallest coherent state, it’s time to expand that circle.  Always look to add new members to your “idea-backing group” from a diverse set of backgrounds, and especially go out of your way to include potential customers and potential backers.  Structure how you present the idea as a solicitation for help, not as an edict.  And involve the current supporters in your solicitation for new members.

As the number of people who know about the idea increases, some will naturally fall by the wayside, while others will hang on.  It’s important to monitor two things about this process.  First, the group needs to gain supporters faster than it loses them – if the overall size of the group of supporters doesn’t grow with time, that’s a possible sign that the idea doesn’t have what it takes to fly.  Second, the core supporters need to be identified and actively engaged, to keep the momentum building.

Some feel that evangelizing is an art form, but there is a science to it that you can, and should, learn.  “Made to Stick” by Chip and Dan Heath provides an excellent framework for making sure that others remember, understand, and even support and adopt your idea.

It may take one round, or dozens, of this cycle to get to the point where you have the will, the vision, the design, and the resources to begin execution, but rest assured that if the idea is truly big and great, it will survive the process and be ready for full-scale implementation.  Be careful, though – not everyone on the idea-backing team should also be on the implementation team.  Make sure that any backer you carry over has the requisite skills, knowledge, passion, and/or position of authority to be able to contribute.  Otherwise, as with the prioritization of ideas, you’ll need to ask them to step aside, even if only temporarily, until that point when they can contribute again.


*Greg McKeown had a great article recently in Harvard Business Review that makes this and other related points: The Disciplined Pursuit of Less


Summarize, Generalize, Hypothesize – Thinking Big, Part 1


The Big Idea

Great leaders think big.  They can spot a great idea, develop it, and then make it happen.  Some people have a natural talent for seeing the larger potential of great ideas, of knowing which sparks are capable of being fanned into bonfires and which are destined to cool and die.  They innately take something and run with it, imagining all of the myriad possibilities and implications.  Their brain is wired to work that way.  However, this kind of “right-brained” thinking does not come easily to everyone.  There are a few techniques, though, that anyone can practice, and if converted into habits, are the first step towards thinking big and becoming a great leader.

Summarize, then Generalize

The first technique starts with the seed of an idea and then mechanically explores it in every direction.  Every time you come across an interesting idea when you read an article, have a stimulating conversation, consume high-quality media, or use a cool piece of technology, try this simple exercise:

  • First, summarize.  Create a “tweetable” headline that captures the core idea – the part that made you say “Hmmmmm…”  Make it short enough that someone gets the idea in 140 characters or less.  Really try to whittle it down to the fewest words possible without losing the essence.
  • Then, generalize.  Step through the words in the headline and insert, delete, or replace each salient word or phrase with related concepts, especially if they are broader than, reciprocal to, or parallel to the original.

As an example, take a look at this summary taken from the opening paragraph of a recent article in the Association for Computing Machinery’s monthly journal:

The annual ACM International Collegiate Programming Contest develops teamwork, skills, and algorithmic mastery.

If you believe this contest is a great idea, why not take it to the next level?  Instead of “annual”, why not bi-annual, or monthly, or weekly, or ongoing?  Why just collegiate – why not high school, why not elementary school, why have it limited to schools at all?  Instead of a single Programming Contest, why not make it into a suite of contests that test a variety of skills – an “Academic Olympics”?  Why limit it to just sponsorship by the ACM – maybe a consortium of professional groups, or some governmental agency?  Taken together, this great little idea could grow into:

The ongoing Dept. of Education International Open Academic Olympics develops teamwork, skills, and mastery of a wide range of skills, at all stages of life.

As with any brainstorming exercise, it’s important not to be too quick to throw away ideas that seem too big, too hard, or otherwise unsuitable.  Take note of them, come back to them later, and pick the one or two that really resonate.

Hypothesize – Ask “What if?”

The second technique* is naturally complementary to the first, and also relies on simple textual manipulation. If the parts of the headline that you changed are represented by BEFORE and AFTER, then list out all of the sentences of the form “What if instead of BEFORE, we had AFTER?”  In the example above, you might ask “What if instead of being annual contests, we had ongoing contests?” or “What if instead of programming contests, we had all kinds of academic contests?”
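Because this step is purely mechanical, it can even be sketched in code. Here’s a toy illustration using the BEFORE/AFTER pairs from the ACM contest example above (the function name is mine, not part of any established tool):

```python
def what_if_questions(pairs):
    """Turn (before, after) substitutions from a generalized headline
    into the 'What if?' hypothesis questions described above."""
    return [
        f"What if instead of {before}, we had {after}?"
        for before, after in pairs
    ]

# The substitutions made while generalizing the ACM contest headline.
pairs = [
    ("annual contests", "ongoing contests"),
    ("programming contests", "all kinds of academic contests"),
]

for question in what_if_questions(pairs):
    print(question)
```

The point isn’t that you’d actually automate this – it’s that the transformation is simple enough to be formulaic, which is exactly what makes it easy to turn into a habit.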

For each such question, mentally tick off all of the implications, both positive and negative, of living in a world where AFTER is true.  Don’t worry about how to make that world a reality yet.  Here you’re trying to suss out the benefits and drawbacks of the idea.  This is a prerequisite for determining whether you may even want to pursue the idea.  It doesn’t have to be a full-on analysis – you can always do that later on.

Making it a Habit

These techniques work best if you have internalized them and do them automatically, but they won’t come naturally to you at first.  To help you do that, let me propose a 30 day challenge.  For each of the four weeks in the challenge, pick one of the four modalities mentioned above (written material, conversations, media like TV/movies/radio/music/art, and technologies), and focus on that modality for the week.  Each day, write down the results of the three steps (summarize, generalize, hypothesize) for at least one idea.  This is a brainstorming exercise, so try not to self-edit.  At the end of four weeks you should have at least 28 ideas – spend the last couple of days polishing them and getting ready for the next steps, which I’ll talk about in a follow-up post: Synthesize, Prioritize, Evangelize.

* This second technique has its roots in a meetup I helped found.  It was originally suggested by co-founder Jesse Bridgewater, with inspiration from the TED talk tryouts.  It’s a great way to have interesting conversations – ask each person who comes to your social gathering to bring one “What if…” – you’ll probably find you won’t even get to them all!


The 4 Principles of Great Leader-Chefs


I recently had the opportunity to watch a wonderful independent movie, “Jiro Dreams of Sushi” – a must-see for anyone interested in food, the artistic process, or leadership. It focuses on a Michelin-starred sushi chef, more than 80 years old, who runs a tiny restaurant in the basement of an office building – one that people wait months for the pleasure of eating at.

One person that figures prominently in the documentary is a food critic who, when asked what makes Jiro such a masterful chef, produces a bullet list of reasons that I found intriguing.

Paraphrasing from memory, they were something like:

  • He has achieved a level of mastery, but is always trying to improve
  • He is a stickler for cleanliness
  • He knows what he wants
  • He is impatient

Even without watching the film, if you’ve seen any of the myriad of cooking shows running on cable these days, you’ll recognize this collection of personality traits as common to all great chefs. The reason I found the list intriguing is because of its general applicability to not just chefs, but leaders in any field. Take a moment and run any great leader you can think of through this checklist – Steve Jobs, Gandhi, Martin Luther King. They all exhibit these ideals.

I recently started reading “Unusually Excellent” by John Hamm. In his model, the first dimension of leadership is credibility, and the second is competence. Mastery of one’s domain is that unique blend of these two dimensions – a demonstration of competence that leads directly to credibility. A master chef could never ask or order a member of his crew to do some menial work without it being clear that he could just as easily accomplish it should the need arise. A lieutenant on the ground had better be able to aim and shoot. And a tech lead had better be able to design, code and debug a solution. But that isn’t enough. The true leader is always “sharpening the saw” as Stephen Covey said – not just practicing what he knows but also challenging the prevailing assumptions and trying to improve both his practice of his craft as well as the definition of the craft itself. In chefdom, you’ll see the great chefs exploring things like molecular gastronomy, hyper-local ingredients and other cutting edge themes. Mastery is not an end state – it is a state of mind.

In the kitchen, cleanliness isn’t just about avoiding contamination by other food or bacteria. It’s also about keeping your mise en place, or as I like to translate it, “keeping your shit together.” To use a term I learned when training as a leader in the military, it’s the “attention to detail” that someone who has achieved mastery naturally uses to discover and discern the small problems that could grow into big ones if left unchecked. Hygiene is physical, mental, and emotional, and true leaders have internalized it as a habit.

When a sous chef has a question, have you ever seen a great chef stop and think? This is not to say leaders should not be thoughtful, but great leaders are not afraid to take action, and then correct course later if it turns out they were wrong. Of course, this relies on a certain level of competency to work, but true leaders are great because they know what they know, have a clear vision, and only stop to think when the decision being made represents a fresh challenge. This kind of long term thinking takes place before the heat of battle, and the leaders always enter into the fray with a clear vision of what it is they want to accomplish.

It’s often said that leaders remove roadblocks. If they had a lot of patience, they would simply wait for the roadblocks to clear themselves, which often happens. True leaders don’t wait. They push forward, they push their team to explore their limits, they are always striving for the next level. In the kitchen, if one team member is not keeping up at their station, the chef doesn’t slow down the whole line – he reallocates the team to get the line working smoothly again, and he does this without hesitation or apology. After the shift is over, he’ll address the underlying problem. If a new dish is not meeting expectations, he’ll pull it from the menu immediately, not wait until the shift ends or the supply of ingredients runs out. The true leader is always of the opinion “Why wait?”

If you are a leader or aspiring to be one – take these lessons from the master chefs to heart. If you are a follower, use them as a guide to evaluate your leaders. If they come up short, take action by embracing these principles yourself to become a leader, or find another leader who lives and breathes them.


That Personal Big Data is Mine



Like Robert X. Cringely, I have partial email archives going back as far as the late 1980s.  And while that may seem amazing to some, the more amazing thing is that I can bring those data files up in almost any email client without even the hint of a hiccup.  I have even, on rare occasion, replied to a handful of these ancient messages, and received back replies.  And of course some of these ancient emails, like those to and from loved ones who have passed away, are priceless.  For the better part of its existence, though, my email has lived in “the cloud,” which is just the trendy way of saying the master copy of the data is stored and maintained by some company on their servers.  It is immensely useful, allowing cross-device access and syncing.  The funny thing is, though, that periodically I download a copy to my personal computer and back it up.  I’m sure I’m not the only person who engages in this behavior, and there are almost certainly even more who wish they had when they switch email providers.  It reflects the reality that I feel this data is “mine” and that my provider is just providing a hosting service.  And because the email format is a well-defined standard, my data is to a large extent portable – at least in the sense that I can take a copy and switch to whatever service I want to, including hosting or writing one myself.

Email is one of the first, and simplest, protocols on the Internet.  It is because of its simplicity (simple text messages are sent and received, in an easy-to-understand, text-based format) that it is so durable.  Of course, the email protocol is limited to a point-to-point communications paradigm, so it doesn’t cover the full range of media types and actions necessary for storing and manipulating general-purpose digital artifacts.  As the world went online in the 1990s, the need for such protocols gave rise to standards like HTML, HTTP, XML, and XSL.  But all of these standards and technologies were primarily adopted by application service providers and hidden, for the most part, from the typical end user.
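As a quick illustration of that durability: because a message is just labeled lines of text, Python’s standard library can parse one with no extra machinery. The message and addresses below are invented for the example:

```python
from email import message_from_string

# A raw message in the standard text format (RFC 5322 style):
# header lines, a blank line, then the body.
raw = """\
From: alice@example.com
To: bob@example.com
Subject: Re: that file from 1989
Date: Mon, 2 Jan 1989 10:00:00 -0800

Still readable after all these years.
"""

msg = message_from_string(raw)
print(msg["Subject"])             # headers are just labeled text lines
print(msg.get_payload().strip())  # the body is plain text too
```

A format this simple is exactly why a decades-old archive still opens in any client – there’s almost nothing to go wrong.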

As an example, I’m writing this post using an app called Evernote.  It lets me do something similar to what I do with email, namely store and manipulate my data (in this case, notes) in the cloud from any device.  I can write one sentence on my tablet, review it on my phone, and continue editing later on a laptop.  Under the covers it might be storing this document as HTML, XML, or who knows what.  I hope that someday Evernote will at least allow me to export to a common format, but I have no guarantee of that.  To be fair, to a large extent the technical community has adopted a common set of data formats that make porting data between different operating systems and applications possible, if painful.  (Some time ago I read an article about the National Archives – how they have to constantly convert data from old media to new, and from old formats to new, to ensure they are still able to access the knowledge locked up in those digital bits.  This is a very expensive, time-consuming process.)
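As an illustration of that kind of format migration, suppose a note really is stored internally as HTML (as I speculate above).  Extracting the plain text with the standard library takes only a few lines; the note content and class name here are hypothetical:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect only the text content of an HTML note, dropping markup."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        # Called for each run of text between tags.
        self.chunks.append(data)

# A hypothetical note as an app might store it under the covers.
note_html = "<div><b>Draft:</b> store notes in a <i>portable</i> format.</div>"

parser = TextExtractor()
parser.feed(note_html)
print("".join(parser.chunks))  # → Draft: store notes in a portable format.
```

This is the small-scale version of what the National Archives does constantly: moving content out of an application-specific representation into one that will still be readable decades from now.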

Yet, as cloud computing becomes more commonplace, it seems we’re moving in the opposite direction, with more and more of our personal data living behind APIs in non-extractable, non-exportable, and fundamentally non-accessible formats.  The most personal of that personal data are the things we share on social networks – photos, comments, links – a virtual diary of our lives (which Facebook has so aptly capitalized on by turning it into an actual Timeline).  Another very important source of personal data is medical records, and the list goes on.  Few will deny the importance and inevitability of the cloud as a storage device for our data.  But we need to make sure that we own our own data.  And as long as it is locked up behind APIs, or only exportable in byzantine, non-standard formats, it’s not really ours.

That’s why we need an open protocol for social networking, similar to the one the Diaspora team has been working on for the last couple of years.  But it needs to be adopted by all the big social networking players.  As I was writing this post, Chris Dixon summed it up best in a tweet: “There would be vastly more innovation and valuable companies created if micro-messaging and the social graph were open protocols.”  I would go one step further and say we need to actively seek out other areas of personal data that live in the cloud and make them portable by adopting standards and protocols for the most common use cases.  For example, there should be no reason I can’t take my Facebook photos and host them on Flickr or Picasa or a home-brewed photo-sharing system I run myself, and then, if I decide to, move them back to Facebook.

Because as we help ASPs build up their walled gardens, there has to come a point where they recognize the value we have created for them and let us migrate elsewhere if we find another garden that suits us better, or even set up shop in our own backyard.  This will have the added benefit of incentivizing them to keep their gardens in order, and it will move us back toward the open ecosystem first envisioned by the pioneers of the internet.




Beale was sweating. Not uncontrollably. Not so much that you could see it when he raised his tuxedo-clad arm to retrieve a serving tray from an overhead cupboard. But noticeably, and with a different kind of nervousness. Ironically, this should have been a piece of cake for him. He had the record time on the Gauntlet, after all. Now he just had to repeat something he’d already done flawlessly, as trained, for over a month.

That first day two years ago, he had arrived as instructed at the square, with high hopes of changing the world.  A minute later, he stood befuddled, holding the rucksack his old friend – the one who had recruited him – had shoved into his hands, and contemplating the cryptic instructions she had breathlessly rattled off from rote before snatching his glasses and dashing away.  “Start at Salsa and Seventh, then make a right somewhere between 4th and 5th.  Oh, and you’d better eat before you start.”  He learned later that last part was her own, unauthorized addition, and while it didn’t strictly give away much, it had made all the difference.  Looking around, there had been only Joints in view.  He couldn’t stomach printed food even on a regular day, and since this was apparently some kind of orienteering test, he really couldn’t afford to be dumbed down.  Desperately, he opened the sack and found, unbelievably, a full, natural lunch, complete with what was obviously a tree-grown, organic apple.  These guys must have a helluva budget, he thought as he sat down on some steps to eat and plan his strategy.  Without the aid of his glasses, he had to use only his un-augmented brain and bio-senses to get to the rendezvous point.  Luckily, he knew there was no street called Salsa, and with that as the starting point, he had unraveled the puzzle in less than two hours.

Now, stepping in front of the dumbwaiter, Beale paused for a millisecond before tripping the camouflaged button that would remarkably, silently switch out its contents, for the last time.  Getting a design spec for this single slice of carrot cake had been hard, but as they say, you can find anything on the internet.  Especially if you have the support and skills of a global terrorist network behind you.  Although it had taken some time, he had come to accept that what he was doing was indeed terrorism.  Slow, subtle terrorism.  Necessary terrorism for the good of mankind.  But terrorism nonetheless.  He liked to think that because he was now on an all-natural diet of precisely what his employer was supposed to be eating, swapped out from the dumbwaiter, he had the clarity of mind to especially appreciate the irony of poisoning a mogul over the course of two weeks using genetically therapeutic food manufactured by the same robotic printer/assembler the mogul had promulgated across the entire world.  But truthfully, even the “dummies” that normally ate that crap could probably understand, and hopefully approve of, his actions.  It was, after all, for their own good, even if they didn’t or couldn’t understand it yet.

“He’s on his last bite,” said the maid-servant as she raced past on her way back to the kitchen.  That was the signal for dessert.  According to Beale’s glasses, the mogul typically paused for 47.3 seconds between main course and dessert when lunching at home on Mondays by himself.  But it also predicted that because of an appointment, he would be rushed today, and there was a 32% chance he would skip dessert altogether.  Beale couldn’t afford for that to happen.  The penultimate dose had to be delivered today for publicity reasons, and the mogul had evening plans that required his physical presence, so he wouldn’t be dining at home.  When moguls died, authorities always did a thorough autopsy – a complete nanoscopic workup.  After all, they weren’t supposed to die, except in accidents.  And if Beale’s counterpart had indeed infiltrated the coroner’s office as planned, the results wouldn’t be squirreled away, but broadcast to the world.  In the best case, the “dummies” would rise up.  Beale was smart enough to know that was a very unlikely outcome.  But the news of the death could lead to an influx of new recruits, and once the Movement had de-toxed them, they would be able to accomplish even more covert operations, and, he hoped, some day truly incite a revolution.

Reaching into the dumbwaiter, his gloved hand brushed the icing of the cake. Most moguls had a distinct lack of tolerance for imperfection, even though their food was paradoxically prepared using natural ingredients with an inconvenient tendency for just such imperfections. Beale’s mogul wasn’t quite so fussy, but he couldn’t take the chance of rejection, and hurriedly repaired the damage using his crumb sweeper. Normally, such an intimate interaction with food would elicit a visceral reaction, even mild salivation, but Beale knew how this “food” was made, and stoically placed the plate on the serving tray.

“No dessert today, Beale,” said the mogul as Beale approached, “Got a meeting, and want to take it upstairs.” He started pushing back his chair.

“But, sir.” Beale knew this was the wrong response. The right response came from the set, “Yes, sir”, “Very good, sir”, or “As you wish, sir.” But even with thirteen days of slow dosing, there was no guarantee the genetic therapy would take. This slice of cake had an especially high concentration of the triggering agent that put the odds of the mogul’s immune system failure at nearly 100%.

“Beg your pardon, sir. Just wanted to be sure your caloric intake was sufficient for peak performance at tonight’s Gala.”

The mogul thought for a second. “Good point, Beale. Wonder why my glasses didn’t point that out? Bring it up to the office, and I’ll try to squeeze in a few bites while on visual mute. Can’t guarantee anything though – those guys in Omaha usually let me do all the talking!”



(This piece was written as an entry to the Big Think Short Fiction Contest #1 : 1000 words of fiction around the theme “Future Food”)


The Tech of Trek


Why do geeks love Star Trek?  One reason could be that the technology it featured was actually much closer to becoming reality than the writers imagined.  Because the tech was imaginable, it was accessible, and it made us all yearn for the day when it became reality.  Amazingly, that day is here in many cases.  I’m certainly not the first person to note this, but in gathering the lists below, I’ve been truly amazed at the number of items that are either here already or aren’t that far away.  (For the infographically inclined, here’s one person’s take.)

Some Star Trek tech that is now a reality (I’ll include the modern equivalents):  Communicators (cell phones), Hypospray, Touch-based Tablets that use gestures, Memory crystals (memory sticks), Voice activated natural language queries (Siri), Telepresence (video chat, teleconferencing).

And if that isn’t amazing enough, here is some Star Trek tech that is on the near horizon (say, the next 5-20 years): Universal Translators, Holodecks (a combination of exoskeletons with virtual reality), Sensors (e.g., see recent articles on T-Rays), Visors (packing sensors and displays into a set of what I like to call iGlasses), Tricorders, Phasers, Replicators (3D printers), Transporters (I would argue that if you combine a Holodeck with the ability to control a remote humanoid robot, you have for all intents and purposes transported), Subspace radio (at very low bit rates via quantum entanglement), Non-invasive surgery, Cloaking.

Some Star Trek tech I don’t see anytime soon (but hey, I could be wrong, and others disagree with me): Artificial gravity, Shields, Warp Drive, Tractor Beams.

Sure, some of the details of how this technology was realized are different, but the effective functionality is there or will be soon.  What amazes me the most isn’t that it’s a reality, but that we’ve gotten there so much faster than we expected.  If you are old enough to have seen the original series back in the 60’s or in reruns in the 70’s, did you honestly believe you’d see some of this stuff in your lifetime?

Which raises the question – what interesting and foreseeable tech have other SciFi writers predicted?  And shouldn’t we expect to see it much sooner than we think?

I’d love to hear your opinion and nominations for future tech in the comments below, via email, or as a reply via a social network…



Bespeak Skillfully and Carry a Nanoscopic Stick


When I was about 13 or so, my father read an article somewhere about the best careers for the future.  Near the top of the list of recommendations were computers.  My dad had the foresight to get me in front of a machine (a Commodore Pet) at work as soon as he could, and also bought a TI 99/4-A home computer for my brother and me to bang away on, which we summarily proceeded to do.  This is one reason why we both landed in Silicon Valley some 15-20 years later, and flourished.  I’ve often wondered, if that article were written today, what advice it would give to parents…

Knowledge of computers and programming is still a useful skill today, and will likely remain so for quite some time.  But those kinds of skills are just table stakes now.  So yes, make sure your kids learn something about the inner workings of computer systems, get them a solid foundation in science and engineering, absolutely have them master their math and statistics, and above all, prepare them for a lifetime of learning.  But to really set them up to succeed as adults, I’d also suggest something you may not have thought of: make sure they have a firm grounding in the liberal arts.  Because they will most likely live their life as “designers,” “makers,” and “bespeakers.”

Looking out 20 years is difficult in this age of accelerating change.  If you believe, as I do, many of the tenets of the Singularitarians, the year 2032 will be so utterly different from the present that it may seem daunting to try to predict a viable educational strategy for a young adult coming of age in those times.  In order to make my case for the liberal arts, I therefore need to don my futurist cap and take a brief detour to describe the most probable state of the world in 20 years.

Some technological advancements that will likely happen by 2032 (hold on to your hat, this is a wild ride):

  • Robots will be ubiquitous, and will have taken over many of the skilled chores we now outsource to cheap labor markets (either at home or abroad), including producing our physical goods, growing our food, discovering and extracting natural resources, recycling our waste, and piloting our vehicles (although the need to transport both humans and goods will be significantly curtailed).
  • Low cost, local, on demand manufacturing will be commonplace.  Need a new gadget?  Just place your order, and it will be 3D-printed and assembled at a nearby convenience store for pickup or delivery in hours or minutes.  Some staples like disposable towels or razor blades or, well, staples may even be “printable” using an at-home “replicator” (chalk up another correct prediction to Star Trek!)
  • Sustainable energy will be ubiquitous and cheap.  Why?  Because we’ll be way past peak oil at that point, so we will have no choice but to solve the problem.
  • Many of our common gadgets today will be woefully obsolete.  Smartphone?  TV?  Camera?  All replaced by glasses/contacts/implants/neural interfaces that can beam images directly to the eyes or retina or neocortex, have an array of built-in sensors, and have access to thousands of times the computing power and storage available in today’s gadgets.
  • We will spend the vast majority of our waking hours (which will likely be most of our hours, as the need for sleep will have been largely eliminated) in either virtual reality or augmented reality.  These artificial worlds, populated by lifelike avatars of our own design that represent us, will seem as real as the real world, and will be just as important.  And we won’t be the only inhabitants – there will also be artificial intelligences to act as assistants, companions, and compadres.  And the augmented world will be populated by real-world objects that hook into the global info net, using a variety of sensors to add to the unfathomable stream of messages on the global communications network.
  • Nanotechnology and biotechnology will have solved most, if not all, of the major sources of disease and illness, including aging.  People will begin to correct and augment their physical and mental abilities with technology, and true cyborgs and bionic humans will walk the streets.
  • Many mundane mental skills will effectively be outsourced, and thus would be “downloadable” on demand.  For example, real-time universal translation between any two languages, in any context, in any modality (verbal, written, Braille) will be ubiquitous.

All that in 20 years?  We’ll see, but even if we don’t quite make it to that point by then, we’ll at least be well on our way.  Of course, there are many other aspects to life on earth (politics, climate change, population growth or decline) that will define the culture and zeitgeist of the time, but one thing is for sure – it will be vastly different from how we live today.  And keep in mind, even if things aren’t quite as I describe in 20 years, they will be at some point in your child’s lifetime, especially considering that they will have a very, very long life.

So what happens when physical items are commoditized, energy is cheap, everyone and everything around the world is connected at all times, manual labor is minimized, and it doesn’t pay to learn shallow mental skills?  Intellectual Property happens.  In this brave new world, knowledge will truly be power, and manifestations of thought will be the new currency.  Which brings me back around to the original topic of this article.  To be a part of this new economy, your children will need to be creators – people who use their brains to gather knowledge, synthesize it, and create something of value for others to use or experience.  And the artifacts that they directly create will be almost exclusively digital in nature.

It’s nearly impossible for me to know in any detail what exactly these jobs will be, but I can try to imagine a few.  For example, new kinds of musicians will come into existence, who will be more like composers, combining all sorts of music on the fly, in response to their live audience, and who use algorithms to adapt the music based on the “vibe” coming from their listeners – even if there’s only one person in the audience.  (Modern DJs and music-recommendation services both presage this concept.)  New kinds of product designers will custom-build designs of a wide range of products for individuals, and then be able to turn around and re-sell those designs to wider audiences or negotiate higher prices with their patrons for design exclusivity rights.  Virtual world designers will be in high demand, and will need a wide range of skills from visual design to narrative skills.  (Modern video game designers are their precursors.)  “Beamers” (as Ray Kurzweil calls them) will make their living by allowing their customers to live through them vicariously in ways that were never previously available, by literally streaming their sensory inputs in a way that makes the end-consumer feel as if they were inside the beamer’s body (like in the film “Brainstorm”).  Nanotechnicians will design and build machines for specific purposes on the atomic level (like in the novel “The Diamond Age”).  Yes, it all sounds like something out of a SciFi novel.  In fact it sounds a lot like the civilization described by Arthur C. Clarke in 1956 in “The City and the Stars,” except this world is only decades away instead of a billion years in the future.

Some things have always been true about humans, and these will, of course, not change anytime soon.  People love a good story, they need social interaction, they enjoy beauty and elegance, they enjoy experiencing the world through their senses, and they have an overwhelming need to both learn and teach.  As our physical needs and desires become easier to satisfy, our emphasis will naturally shift towards producing other, less tangible things that fulfill these essential human needs.  And, more importantly, both the supply and demand will move into the “tail” of the distribution.  In other words, not only will people demand more customized, even bespoke, products and services, but by the laws of supply and demand there will naturally emerge more people to supply them.  Those will be our children – both the producers and consumers.

What skills would such designers, makers, and “bespeakers” need?  Clearly they’ll need a command of the tools – computers, nanotech, biotech, and their derivatives.  More importantly, though, they’ll need to know how to manage the creative process, relate to others, communicate their ideas, tell a good story, draw a good picture, relate everything to the rich history we have as humans, and build a philosophy of living in this changed world.  In short, they will need a solid grounding in the liberal arts.

A second renaissance is coming – will your child thrive?  Will you?


2012-01-19 P.S. This article was just published – my favorite quote: “being more fully human is what individuals will need to stay one step ahead of computers”.  The Career Of The Future Doesn’t Include A 20-Year Plan. It’s More Like Four. | Fast Company

2013-02-07 I just finished reading Daniel Pink’s A Whole New Mind, which comes to the same conclusions as I do here, but using different sources of evidence.  He also develops the types of “right-brain” skills that will be necessary, and ways to sharpen them.  Recommended.