Wednesday, May 01, 2013

Starbucks Pisses Me Off [and is forgiven]

I have been a VERY loyal Starbucks customer for many years, as anyone who knows me will attest. I say good things about them.  I am a brand ambassador for them.  I go to Starbucks every day, though I took a few months off for health reasons at the end of last year.

My drink for the past couple of years has been a "triple short no-whip mocha," which seems to fall through the cracks of their policies, because I like a small amount of milk, but it costs as much as a Venti with 3 shots.  I pay $4.65 for an 8-ounce coffee drink.  Every day.  By conservative estimate, I spend about $1500/year at Starbucks.

But that's not what I'm pissed about.  That's just background.

A few years back, I bought a Gold Card when they cost $25.  It gave me a 10% discount on all purchases.  I loved it.

Since then, I've watched as they changed the "rewards program" to reward me less and less for my loyalty.  First they got rid of the 10% discount in favor of a free drink every time you bought 10 drinks -- but they mailed you a postcard to redeem the freebie, knowing that few people would get it together to redeem the postcards.  Then it was 12 drinks; now I think it is up to 13.  But it gets worse.

There now is some kind of minimum threshold you have to meet to maintain "gold status", which is 30 stars within some time frame.  As if somehow I am no longer a "gold customer", which by most retail standards I certainly am.  They have my history in their little computers, since I always use my card (now the app) for my purchases.

Here's what pisses me off.  Yesterday I was getting close to a free drink, I noticed in the little app.  Two more stars to go.  Woo hoo!  Today I bought a $4.65 mocha, a panini sandwich, and a water, and used my app to pay....

I got an incredibly unfriendly alert that popped up and said, "You have failed to meet the minimum criteria to maintain membership.  Your reward stars have been reset to 0."

I stared at it in disbelief.  This is how they reward their best customers!  Gamification is one thing, but actually penalizing me for not earning 30 stars in some arbitrary amount of time -- and how exactly am I supposed to do that if you reset my count to 0?

This is appalling to me, and actually made me upset, right there at the cash register.  I got totally, completely pissed off at Starbucks, and vowed to boycott them.

Is that what you want with your "Rewards" program, Starbucks?  To piss off one of your best customers with your little star program, to the point where he doesn't want to come back into your stores, and will be considering Peet's or some other worthy competitor from now on?

Congratulations to your brain-dead rewards marketing team for totally screwing up what once really did feel like a "gold" program, and made me happy to buy coffee at absurdly inflated prices.  No more.  No more.






[Epilogue/update: 5/2/13]

A friend happened to post something on Starbucks' wall at almost the same time yesterday that I posted this, complaining in almost the same way about no longer wanting to remain loyal to Starbucks.  I put a comment on her post mentioning this blog entry.  I think a Starbucks employee must have read my comment, and this blog post, because I got the email below today.  I am undecided whether this suggests great customer service and I should be happy again (a possibility), or whether it is impersonal (no actual contact from customer service, just this email) and lame, and would not have happened had I not complained publicly.  Whether or not I am Gold is somewhat beside the point -- they need to seriously revisit the reward system, because it is not, in fact, set up to reward actual loyal customers.  It's more like a video game where you can periodically get a "game over" screen and have to replay the whole level.  Who wants that in a coffee rewards system?

Here's to another glowing year at Gold. Raise a mug to celebrate.
Thanks for staying. The 30 Stars you earned keeps you at the Gold level another year. Here's to another year of rewards galore.

You're on a roll, so keep earning those Stars. Another 30 within a year keeps you at Gold level for yet another year. We're hoping this will be a happily ever after type of thing.

[Epilogue/update: 5/22/13]

I've decided I forgive Starbucks, and have reloaded my card twice since posting this article.  The baristas are great, and overall it's a great company.

Tuesday, October 09, 2012

Tanks


Tanks, a set on Flickr.

I took these photos a while back when touring "Pony Tracks Ranch", the largest private collection of tanks and heavy artillery in the U.S., and maybe the world. It's now the Military Vehicle Technology Foundation.

Wednesday, August 01, 2012

The Science of the Full Moon

By the time I finish this blog post, it should be exactly a full moon: 8:28:30pm, according to my iPhone app.

I am a very scientific thinker, yet I believe in the power of the full moon.  Why?

Scientists observe things carefully and try to draw correlations and conclusions. Some of them we can prove, some we can't. But I am really good at noticing and recognizing patterns, and that's where science starts: observation and pattern matching.

It is my observation that people are weird around the full moon, and more passionate, and more impulsive, and more romantic.

Does the moon cause this?

Not directly, as in gravitational pull or tides or anything like that. But we all see and experience the moon, and it affects us, like daisies or sunshine or the ocean. In that sense, yes, the full moon affects human behavior and makes us slightly crazy ... in a good way.

I will drive up windy Highway 84 tonight and try to see the moon at every opportunity, and I will think of my Mom, in Maine, who just finished doing the very same thing.  She loves the full moon.


Wednesday, July 11, 2012

Beginnings of iMovie

I just ran across an email exchange between me and Steve Jobs from 1998, in which the very beginnings of iMovie are envisioned.  It was from a discussion before I was hired into Apple to actually build iMovie.
I was ridiculously long-winded, and Steve was very terse.  That's how it was. Many years later I saw him typing an email and understood why he was always so terse: he was a very slow, two-finger typist!  I know, hard to believe.
Here is the email, unedited.

-------------------------------------------------------------------------

From: Steve Jobs
Date: Mon,  8 Jun 98 12:32:41 -0700
To: Glenn Reid
Subject: Re: questions

I'd also love to just drag special effects on my video timeline and have
it know what to do.  Automatically find the nearby splice and install the
transition effect perfectly and all automatically.


Steve


X-Sender: rtbrain@206.184.139.133
In-Reply-To: <9806060028.AA00332@ni-master.pixar.com>
Date: Fri, 5 Jun 1998 18:37:46 -0700
To: Steve Jobs <
From: Glenn Reid <
Subject: Re: questions

>I think Avid Cinema is the closest and best thing out there.  Can
>we do better?

Depends on what "better" means.  Obviously one could go way off
the deep end on features but not get it right (there are lots of
examples of that :-)

The biggest hurdle, I think, is conceptual: to make people believe
they can do this themselves and that it's not complicated.  Desktop
publishing was an easy leap for people because it was only a little
more complicated than a typewriter, which they understood.  By
contrast, 3D has never caught on because nobody thinks they can do
3D, and to a large extent, they're right.  Too damn complicated.

Video is right in between stovetop publishing and 3D: it's possible
to bring it to the masses, but it has to be dirt simple.  Avid
Cinema is close to being that.  I like their top-level approach, which
is four steps:  1) Storyboard, 2) Bring Video In, 3) Edit Movie,
4) Send Movie Out.  It doesn't get much simpler than that, unless
you get rid of Storyboard.  They have a fair number of editing
tools, a timeline, and other things that resemble video editors
more than they resemble stovetop publishing, and I'd be tempted
to simplify those even more.

What I think is needed is the "SimpleText" of video.  It doesn't need
to do much other than let you read in some movies, do some splicing
and editing, and write it back out.  Get rid of the dead time, the
places where you said stupid things right into the microphone, etc.,
and send it to grandma.

The hardest thing about editing video is finding the stuff you
want on a tape and getting rid of the stuff you don't want.  There's
no magic to that: it's just grunt work.  We might be able to do
some guesswork to find the transition points, but it would only
be guesswork.

The key will be to find a conceptual leap that thinks of video
differently than ever before.  If you've ever looked at the "Variations"
dialog in Photoshop; something like that.  Instead of giving you
a dialog box to adjust RGB (who knows if a picture needs "more red"
or "more magenta" by looking at it?) they show 10-12 variations
that would result from adding blue, magenta, etc, and you just click
on the one you like.  It then shifts your choice into the middle
and lets you keep clicking to improve the image by choosing the one
you like the best from the variations presented to you.  It's brilliant.

I think there's still some room for this king of conceptual leap
in video editing: something simple and elegant that makes finding
what you want easy.  The only thing that comes to mind right now
is a kind of "binary search".  You show two points in the video
and let them click somewhere between, as in "I think it's a little
after the birthday hat, but the backyard stuff was quite a bit
later":

   "birthday hat"                                          "back yard"

   |---------------X------------------------------------------------|


You click around where the X is and it narrows down the search again
and again, until you find the end point.  It could be quasi-animated
or aided by guesswork somehow.

Anyway, I'm digging in too deep, but I think there is room for
innovation and simplification.  I'm guessing, from talking with
Sina and Will Stein, that what you're after is the Democratization of
Video Editing, the simplest and most obvious tool yet for doing
basic editing.  It seems like it could be done.


Glenn

-------------------------------------------------------------------------
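Rereading that "binary search" idea now, here is roughly what I had in mind, sketched in a few lines of Python.  This is just an illustration written for this post -- the function and numbers are made up, and nothing like it actually shipped in iMovie:

```python
def narrow(lo, hi, clicks):
    """Interactively narrow a time range on a video timeline.

    lo and hi are positions (say, seconds into the tape).  Each click is
    a fraction between 0 and 1 of the *current* window, meaning "I think
    the point I want is about here."  The window re-centers on the click
    and halves, so a handful of clicks pins down the spot.
    """
    for frac in clicks:
        guess = lo + frac * (hi - lo)     # where the user clicked
        quarter = (hi - lo) / 4           # half of the new, smaller window
        lo = max(lo, guess - quarter)     # clamp so we stay on the tape
        hi = min(hi, guess + quarter)
    return lo, hi

# Three "it's about in the middle" clicks take a 64-second window down to 8.
lo, hi = narrow(0.0, 64.0, [0.5, 0.5, 0.5])
print(lo, hi)
```

The point is the interaction, not the math: every click halves the haystack, so even a clumsy guess gets you to the splice point in a few steps.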


Thursday, February 16, 2012

Dumber and Dumber

I read today about "Mountain Lion", Apple's continued dumbing down of MacOS X.  I sat up a little straighter in my chair.

Everywhere I look, things are getting dumber.  A "smart phone", maybe, but it's really just a small, dumb laptop with a phone in it.

Our country is getting dumber -- our blockbuster IPOs are fluffy stuff like "social networking", the Super Bowl and Desperate Housewives are the pinnacle of entertainment, while science is being slowly but systematically bred out of our youth.

Nobody can spell or write well any more.  Our goods are not well made, sound bites have replaced thoughtful discourse, people say they want to "Fix America" when they really just want cheaper gas and more crap to watch on TV.

So-called Mountain Lion got me thinking about the Renaissance today, and how we are spiraling backwards by almost any cultural or intellectual measure.


It has been almost 500 years since Leonardo da Vinci died, at the age of 60.  He contributed more to the world in one short lifetime than the rest of the world's population has managed in the subsequent several hundred years.


What the hell is going on, and why aren't we doing anything about it?

Saturday, February 04, 2012

Social Networking and the River of Crap

Learn what is floating downstream in your personal River of Crap.

All the social networks now have at their core what I call the River of Crap.  That is the central news feed that is pulled from your contact list and what they choose to post or "share".  It is effectively a newspaper, where the writers and editors are your friends/contacts.  It really can be a river of crap, but luckily you can filter it. Some of it is advertising, just like a newspaper, and some of it is truly interesting and relevant.

I've been building, using, and thinking about social networking for a long time. At its core, it's not what we all think it is.  It is not a way to "stay in touch", it is not a "friends list", it is not a place to "upload photos"....

Facebook and Twitter and Google Plus are a form of personal branding: they have become how we tell the world who we are, and we do it by showing others the things we like.  A few decades ago we did this with bumper stickers and T-shirts and baseball caps with logos on them.  Now we do it by posting on a social network instead.  When you share a picture of Obama with words superimposed on it, you are making a statement about yourself.  When you Like a post that criticizes the Susan G. Komen foundation for pulling funding from Planned Parenthood, you are essentially putting a bumper sticker on your virtual car.

Facebook and Twitter are also editorial services, where the editors are people you know.  The most common reason stated for not liking Facebook, or not wanting to participate, is the sense that you have to constantly read about "what other people had for breakfast."  While that's true sometimes, it misses the point.

I used to monitor Google News every day, and sometimes I still do check it.  But it is an automated filter, and it is not nearly as effective as my own personal Rivers of Crap.  If anything interesting or important is going on, I read about it first on a social network.  Don't you?

There are slight (but important!) differences between Facebook, Twitter, and Google Plus when it comes to the River of Crap, and how it gets filtered.  This may well be the sorting algorithm by which the winner is chosen, in the long run.

Facebook has some hidden algorithm that decides what you should see.  It's complicated, and doesn't feature all your friends equally.  Everybody notices this, and nobody really likes it, but you can't quite tell what it's not showing you, so nobody complains all that much about it.  It is a filter, but you can't control it.  The remaining filter controls are all basically "hide" in one form or another.  If you don't like a post, or a person's posts, you can hide it, or them, from your Feed.  When I talk to people I realize that most Facebook users don't really use this.  They just accept the River of Crap for what it is, and use Facebook more or less based on that.  That is probably why Facebook tries to automate the filter on your behalf, because few people take control and do it themselves.

Twitter doesn't have filtering, they have Search.  Your River of Crap is just there, scrolling by, and perhaps because of the real-time focus and 140-character limit, people post a lot more frequently to Twitter than other networks.  It's a faster flowing River of Crap!  But to filter and decide what you want to read, you end up using #hashtag searches to follow topics.  Less good as a newspaper metaphor, but better for research, because you can find information posted by people whom you are not following.

Google Plus introduced Circles as a new way to aggregate the people you're trying to follow or pay attention to.  It is complicated, both in terms of posting (do people really make the decision to post to circles other than Public?) and in trying to use it to filter what you read.  And it combines the two-way nature of Friends lists (facebook) with the one-way nature of Following (twitter) but in doing so, it confuses most of us.  Frankly, the whole thing doesn't work very well yet as a social network.  But it is a more powerful mechanism in the long run.  Like many powerful mechanisms, if nobody uses the power, and people just stare at the River of Crap and decide whether to participate or not, it will likely not win hearts and minds.

What's interesting to me is that the people who build and run these services don't seem to understand what they have built. They don't offer Newspaper-like filtering, or topic-based viewing, or any other way to control the River of Crap.  They still think they're building networks of Friends.

Sunday, January 29, 2012

The Jackling House

I wonder if the people who prevented Steve Jobs from demolishing the Jackling house for so many years feel bad?  From reading their page, it sure doesn't look like it.

I mean, it got demolished anyway, but Steve died and didn't get to build his dream house.

Nice going, guys.

Monday, November 14, 2011

The Microblogger's Dilemma

Will every site soon have a "status update" field? How can I possibly update all of them?!

This has passed trend and is headed straight toward pandemic. I suppose because it's so easy to implement, there is a proliferation of "what are you doing now?" kinds of status update opportunities. Twitter. Facebook. LinkedIn. Google Plus. And that's just the biggest, most popular ones, and doesn't consider geolocation updates, which is a whole nother set of sites (FourSquare, etc).

The dilemma is ... which one do you update?  If only one or two interesting things happen to you in a day, or week, what site do you update? I solved this for a while by tying my Twitter account to LinkedIn and Facebook, but there is an unspoken disapproval of this leveraging of posting. You can't be a true Facebook devotee if you only post to Facebook via Twitter, right?  And how could you possibly post both to Google Plus and facebook?!  That's heresy!

[An aside to you Facebook devotees ... yes, I know that they now like their name to be all lower case, but like Wall Street Journal editors, I refuse to follow all weird capitalization schemes, preferring to stick to my own journalistic standards.]

If auto-reposting from Twitter to Facebook and LinkedIn is not cool, can I copy/paste the same thing to Facebook and Google Plus?  What if most of the people, like me, are members of all of them? Won't they see through my ruse, and discount my "interestingness" because I post the same thing to all of my microblogs?

If I post different things to each site, what does that mean?  Someone who is interested in what I have to say now has to look in 3 or 4 places?  What a waste of time for them, and for me.

And yet, if you join Facebook, or Twitter, or Google Plus, and neglect them, that's the worst of all, right?

The paradox I find greatest of all is the parallel proliferation of "get funding quick!" sites for angel and vc funding.  I have joined several recently, to look around, in pursuit of elusive angel funding. But if you run, say, angel.co, you don't want me to also be on growvc.com (never mind that the founders of one site may well be active on the other). I actually got an email from somebody at angel.co basically saying that I hadn't spent enough time on their site, or filled out enough data, or updated my status enough, and therefore I wasn't worthy of being recommended for investment. It should occur to them that the less time I have to update yet-another web site, the more likely it is that I'm doing actual work worthy of investment. Myopic.

I am posting this diatribe, er, open question, to (gasp!) blogger.com, where I maintain an old-fashioned blog from the middle of the last decade.  It is at least persistent through all these trends, and supports more than 140 characters.  I will, of course, post a link to this through bit.ly to all my microblogs -- and I will do little else. In order to feed the appetite of these microblogs, I must do that as rabidly as I once processed email (okay, okay, I still do rabidly process email).

I guess my point is ... are we being asked to declare our allegiances to particular vendors/technologies by where we choose to update our status?  What an odd result of a weird little microtrend.


Friday, November 11, 2011

Open Letter to Web Site Developers

Dear Web Site Developer (or misguided management):

Please don't do any of these things on your web sites:
  1. Ask me to enter my email address twice.
  2. Tell me what characters I can and can't use in my password.
  3. Time out my sessions for my own protection.
  4. Make me change my password every so often.
  5. Make every field in your form *required.
  6. Make it impossible for me to change my email address.
  7. Insist that I provide you with a security question and answer.
I know how to type my email address and I know more about how to create a secure password than you do, and I do not forget my passwords. You have meetings where you talk about "reducing friction" for people to join your sites. You create friction every time I log in, not just when I sign up.

If you are a bank, and your page times out after 5 minutes and I have to log in AGAIN, inside my highly secure physical location with no possible access to my computer by anyone but me ... are you protecting me, or irritating me?

Sincerely,
Glenn Reid

Tuesday, July 12, 2011

Expert Culture vs. Ease of Use

There is a phenomenon I call an "expert culture" where things that are hard to understand and use become popular precisely because they are hard to use. Once you figure out how to use something complicated, you become an "expert", and it feels good to be an expert. You help other people because it makes you feel smart, and then they learn, and then they are an expert too.

Conversely, products that are easy to use and have few unnecessary features are often dismissed as trivial or underpowered.

This is a fascinating and bizarre contrast, and it is very counter-intuitive. We are all led to believe that things that are Easy to Use get adopted, and complicated things are eschewed. There are many counterexamples to this, although Apple products are perhaps an existence proof that at least somebody buys Ease of Use.

This occurred to me as I was deleting some early posts on Google Plus that were open questions, trying to figure out how Google+ worked. Valid points, I felt, and reflective of a "newbie" experience on a new platform. I was deleting them because I felt foolish for having posted them, and I realized that Google Plus is an Expert culture, and facebook is "for the rest of us".

Circles alone, in Google+, are really complicated, even once you know how they work. Consider this graphic representation of the rules for who can see your post on Google+. If that's not an expert culture, I don't know what is. At least half the posts I have seen go by on Google+ are in fact about how to use Google+. That tells you something too.

[I started posting this on Blogger because it's essentially a blog post, and I may finish it there too. But I wanted to see if this medium could replace blogging completely. I don't think so, not quite yet. I don't have enough control, can't set a title, and I can't embed links and things like that. Maybe I'm just used to those things, and blogging shouldn't rely on them. We'll see.]

Sunday, July 03, 2011

JIT vs Buffering

I've been studying Just-In-Time techniques for business and manufacturing and process (or JIT, or the Toyota Production System). There are fascinating parallels between computer science, where JIT is also used to some extent, and manufacturing. I have some thoughts to add to this discussion, and no place to add them, so I'm blogging about it.

JIT has two things at its core: predictive forecasting, and signaling. Predictive forecasting is easy to understand: the more predictable your task is, the more lean your process can become, because you know what is going to happen. Signaling, or Kanban, is an abstraction for somebody raising their hand and shouting: "hey, we're out of rivets over here!" Both are necessary for JIT, and they are to some extent opposites of each other. If you know exactly what's going to happen, you should have no need for signaling.

The real world is, of course, somewhere in between, and the process you design is at the mercy of the business problem you're addressing. Toyota developed this process for manufacturing cars, where demand isn't completely predictable, but it has a little bit of built-in time padding -- a car does not need to be finished within 3 hours (approximate assembly time) of ordering to meet customer expectations.

Banking, on the other hand, is essentially real time. If a customer asks for their money, you have to give it to them. There is a complex JIT system behind banking that tries to predict this demand and provide the money "just in time" from thin reserves at branches and central banks, but a failure in predictive forecasting of JIT banking is considered bad: a "run on the bank."

I like looking at problems from both ends. If you consider real-time phenomena that don't quite work because of the inability to perfectly forecast demand, like banking, computer networking, restaurant management, and the flow of water and electricity, you see that buffering (inventory) is introduced to smooth out the flow and meet demand in real time.

In manufacturing and retail, where inventory and stock was presumed, the effort was the opposite: to remove the buffering and provide real-time supply: JIT manufacturing.

I believe that these are opposite sides of the same coin. An inventory buffer can reach zero, the effect being that a signal is thrown, and the process waits for inventory to be delivered. [It is worth noting that this is exactly how computer networking works, with signals and everything].

Here is my thesis, then:
All processes have buffering (inventory) and variable demand. Optimization is the same in all scenarios: you are simply adjusting the size of the buffer.
This is commonplace in computer networking. You explicitly state the size of a buffer, and you can adjust it. 2k? 20k? 200k? Depends on what you're doing (chat vs downloading movies), and on, yes, how predictable demand is. Chat has low volume and a high degree of unpredictability; video download is completely predictable and high volume. The buffers are much larger for video download than for chat.
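Here is a little sketch of that thesis in Python (purely illustrative -- the names, delays, and numbers are all made up): a producer fills a bounded buffer, a consumer drains it, and the only knob you get to turn is the buffer size.  An empty buffer is the signal -- the consumer simply waits, which is the Kanban moment of shouting for more rivets.

```python
import queue
import threading
import time

def run(buffer_size, items=20, produce_delay=0.001):
    """Push `items` through a bounded buffer and count consumer stalls."""
    buf = queue.Queue(maxsize=buffer_size)  # the inventory shelf
    stalls = 0                              # times the consumer found it empty
    consumed = []

    def producer():
        for i in range(items):
            time.sleep(produce_delay)       # supply is never instantaneous
            buf.put(i)                      # blocks if the shelf is full
        buf.put(None)                       # sentinel: no more inventory

    def consumer():
        nonlocal stalls
        while True:
            if buf.empty():
                stalls += 1                 # signal thrown: waiting on supply
            item = buf.get()                # blocks until something arrives
            if item is None:
                return
            consumed.append(item)

    t1 = threading.Thread(target=producer)
    t2 = threading.Thread(target=consumer)
    t1.start(); t2.start()
    t1.join(); t2.join()
    return consumed, stalls

consumed, stalls = run(buffer_size=2)
print(len(consumed), "items consumed;", stalls, "times the consumer had to wait")
```

Turn `buffer_size` up and the stalls smooth out at the cost of inventory sitting on the shelf; turn it down and the consumer spends more time signaling.  That trade-off is the whole game, whether the buffer holds video packets or roofing nails.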

Let's consider another example: a team of roofers putting new shingles on a building. There are several intriguing layers of process and signaling and buffering in this seemingly straightforward task.

First, the size of the roof is known, so it would seem that the raw materials can be supplied in exact quantities. But the roofers always order a little extra material -- more nails, shingles, and tar paper -- than is necessary. This is to allow for the unexpected (unpredictability in forecasting), including errors and damaged materials. The less waste the better, so the amount of extra material is adjusted for each job, trying to match the challenges of the roof geometry to the skills of the crew and the number of shingles per package. Can you do the job with nothing left over, and no trips to the hardware store?

Next there is the transportation issue. Some roofing contractors lift all the shingles up onto the roof ahead of the job, spacing them approximately according to coverage. This is an extreme of buffering in advance. Other contractors have somebody yell for more shingles as needed (signaling) and have somebody run up the ladder with another pack of shingles. Which is better? It depends on what you're optimizing. Pre-placing the shingles suggests that the application of shingles is more skilled labor, and more expensive, and should never have to wait for the [cheaper] labor to carry the shingles to them. But if the nailer yells before he/she is actually out of shingles, then the more skilled resource is never idled, and the effect is the same. Or more correctly, the optimization moves upstream, to the difference between serializing the tasks (putting all shingles on the roof before you start) and the cost differential between renting a boom lift truck vs a crew of people to carry shingles up the ladder.

The reason I thought of roofing as an example is that it has buffering in unexpected places. How many shingles does a roofer take out of the pack before walking over to the area currently being roofed? Buffering. How many nails does the roofer take in hand at once? Buffering.

The size of a buffer is often artificially constrained to be the size of a roofer's hand, or the available shelving space in a warehouse. To the extent that the "natural" buffer size that would best optimize the process does not match up with the actual size of the buffer available, you get inefficiency.

This mismatch between physical buffer capacity (tank size, shelf size, memory chip size) and the optimal buffer size for a given process is, I think, a very profound issue at the heart of many, many serious problems.


Sunday, June 05, 2011

Why Podcasting Never Really Happened

Back in the day, I predicted that podcasting was not really "a thing" and would go nowhere. Nobody really agreed with me, but I believe that now it is safe to declare podcasting officially dead. (Blogging is almost dead, too, so this posting is perhaps paradoxical).

Here is why podcasting (as an authoring paradigm) never happened: audio is the same as video. Except not as good.

Video and audio are both time-based media. In fact, audio is simply a subset of video. You can see this on YouTube by finding a song you like, posted with lame photographs layered on top of it as the "video" part.

But it is actually much harder to edit audio than to edit video, because there is nothing to look at when you're editing (almost nothing -- you can get waveforms that help a little, but they're pretty hard to use effectively). Video has cues and transition points and also audio, so it's just easier to edit. Period.

So, audio is less good, less interesting, and harder to produce and edit than video, and it takes just as long to consume. Why would it ever become a popular consumer authoring medium? Exactly.

It's actually easier to understand the value of black-and-white TV after seeing color TV than it is to understand the value of audio-only on your computer, or iPod. Podcasting failed, and instead, iPods now all have tiny video screens!

Tuesday, April 12, 2011

The Truth about Multitasking

I've been reading recently about scientists studying multitasking and declaring that we're no good at it. I have something to say about that.

First, some background. Multitasking is a word borrowed from computer science and reapplied to life, which doesn't happen that often. In a computer's operating system, at the very heart of it, is a switching mechanism for doing a little bit of work on a lot of running programs, in a "round robin" approach, so that each of them gets a little time slice. No one program gets to run unbridled for any length of time. Processors are so blindingly fast that you can't tell the difference. A typical CPU today can execute many billions of instructions per second, so the fact that it is switching back and forth between, say, your browser and your email program and updating the display is done so many, many times per second that you couldn't possibly perceive it.

But yes, the computer is multitasking. And guess what? It doesn't come for free.

To switch back and forth between the many processes running on your computer, the operating system does what is called a context switch. This is important, so pay attention. A context switch isn't just jumping around between processes. The CPU needs to also store the context from each process, so it can come back to it later. The context is as little as possible--a bunch of memory locations, a few details of what the processor was doing, and things like that. It's not unlike the folders on your desk that contain the context of your human tasks -- the bills that you might be needing to pay, or the phone number and resume of the job candidate you're about to call.

There are decades of research on operating systems aimed at trimming down how much context you need to save when switching between processes. The less, the better, with a huge multiplier in efficiency for every little bit you don't have to store, because the CPU has to physically copy the information back and forth every time it does a context switch. So do you, by the way, when you multitask. Understanding this, and making your context storage small and efficient, helps a lot with multitasking efficiency. If you have to open up your Word document, or find the phone number, every time you get back around to a task, the startup cost is too great, and you spend all your time thrashing (a real computer science term, believe it or not) and not doing real work. It feels like real work, but if you're just opening and closing folders (contexts) you're not doing anything useful.
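
Here's a back-of-the-envelope model of that startup cost, in Python. The numbers and the `useful_fraction` helper are invented purely to illustrate the point, not measurements of any real system:

```python
# Toy model of context-switch overhead. Assume the cost of a switch is
# proportional to the size of the context you must save and restore,
# and that a task runs for one "time slice" between switches.
# (All numbers are made up for illustration.)

def useful_fraction(context_size_us, time_slice_us):
    """Fraction of CPU time spent on real work instead of switching."""
    switch_cost_us = context_size_us  # saving + restoring the context
    return time_slice_us / (time_slice_us + switch_cost_us)

# Small context, generous time slice: nearly all time is real work.
print(useful_fraction(context_size_us=10, time_slice_us=1000))  # ~0.99

# Huge context, tiny time slice: almost all your time goes to opening
# and closing "folders" -- thrashing.
print(useful_fraction(context_size_us=1000, time_slice_us=10))  # ~0.01
```

The same ratio applies to human tasks: shrink the context you have to reload (keep the folder organized, the phone number on top) and the switch gets cheap.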

Now here's where it gets interesting.

Computers don't just switch evenly between all processes. They look around to see where time is best spent. For example, many processes are blocked on I/O (input/output). That means that they're waiting for a file to be opened from your disk, or something to come back from the "cloud" over the network, or any of various other wait states that are commonplace deep inside a computer. If a process is blocked, you don't give it any time. Simple as that. More CPU time for the other processes.
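
A few lines of Python make the idea concrete. This is a toy sketch, not a real scheduler (those live in the kernel and are vastly more sophisticated); `round_robin` and its little process table are inventions for illustration:

```python
from collections import deque

# Minimal round-robin sketch: every runnable process gets one time
# slice per pass, but a process blocked on I/O is skipped entirely --
# its time goes to the processes that can actually run.

def round_robin(blocked, slices):
    """blocked: dict of process name -> True if waiting on I/O.
    Returns the order in which time slices were handed out."""
    queue = deque(blocked)
    schedule = []
    for _ in range(slices):
        for _ in range(len(queue)):
            name = queue[0]
            queue.rotate(-1)       # move it to the back of the line
            if blocked[name]:
                continue           # blocked: no time slice, try the next one
            schedule.append(name)  # context switch in, run one slice
            break
    return schedule

procs = {'browser': False, 'email': True, 'music': False}  # email waits on the network
print(round_robin(procs, 4))  # ['browser', 'music', 'browser', 'music']
```

The blocked process costs essentially nothing: the scheduler glances at it and moves on, and the runnable processes absorb its time.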

See what I mean, about getting interesting?

In the real world, our I/O is blocked all the time, too. We're waiting for somebody to call us back. We're at a stop light. We're waiting for the DSL modem to reboot. We're standing in line at a store, or even worse, at the DMV.

Human beings are not stupid. We know, as a species, that we can use our blocked I/O time for something more valuable. It is why we text madly at stop lights, or talk on the phone at the DMV, or do things that seem anti-social to old people, but are good time management when done well.

When scientists study "multi-tasking" in the lab, they do it abstractly, with tasks that don't exist in the real world, and have very few wait states. So yes, it is less efficient to switch frequently between truly parallel tasks that have no wait states. You spend time context switching and lose efficiency. But that's not how the real world is, and that does not seem to be accounted for in these studies.

Consider the teenager listening to music, texting, talking on the phone, and doing homework. Crazy? Good multi-tasker? If you dissected this scenario carefully, you would see all kinds of wait states. Texting has big holes in the timeline, waiting for somebody to reply. Even talking on the phone is like that. You can tune out for a few seconds and the person you're talking to doesn't notice, or care. With some people, the talkers, you can help yourself to 30 seconds of not listening and they won't even notice. Listening to music can be a background task, requiring a different part of your brain than conscious thought. Gee, there actually is a fair amount of time left over for homework, as long as it's the kind of homework that doesn't require your frontal cortex (which is, alas, most of it): coloring in the graph for biology; searching for something on Google; writing your name and the date at the top of each piece of paper.

In the study I mentioned at the beginning of this article, the "multi-tasking" was forced on people by interrupting them with annoying tasks, which did not allow them to make the context switch themselves, or save state. That's not a useful measure of multi-tasking; it's the reason we don't answer our phones and close the door at the office. Interruptions, I claim, are not a form of multi-tasking.

Anyway, I think it's time somebody did a study on human multi-tasking with the ideas of context switching and wait states deeply embedded in the study. I guarantee they would get very, very different results.

Friday, February 18, 2011

Cost cutting: rolls of toilet paper? Really?



It was bad enough when Cheerios boxes suddenly got 20% thinner than they used to be. Same price, less content. There was a rash of that when the economic downturn really hit. I could see that it was probably necessary, but found it discouraging.

Today, I discovered that the same brand of toilet paper that I've been buying forever (Scott) has made their rolls 3/8" narrower than they used to be. That may not seem like a problem, except that the toilet paper holder at work is one of these designs:


And the roll simply doesn't fit! It falls right out of the holder:


Here's the difference in the old and new sizes. It's quite significant. In the words of Wayne Campbell: "Shyah, as if we wouldn't notice!"


Luckily, I have a machine shop, and I'm not afraid to use it. And I have some inventory of copper tubing of just the right diameter:


Wednesday, June 16, 2010

Inventor Labs blog

We started a product design-oriented blog over at Inventor Labs. I will continue to blog here (sporadically, as always) but may put more technical or product-related posts over there.

Sunday, April 18, 2010

Design Renaissance conference

I spent much of today at a conference in Santa Cruz, CA, on an incredibly beautiful day. I was inside much more than a sane person would have been. Santa Cruz is a special place: people care more about more things, per square centimeter, than almost anywhere else except maybe Berkeley.

This was a good conference, though it had less to do with Design or Sustainability than I would have expected. There was some of that, of course. But it was Politics in equal measure.

The best part of the entire day was Eric Corey Freed, Organic Architect. I'm sure he actually is an architect, but that was decidedly beside the point. The man gives an amazing presentation, right up there with Steve Jobs, except the subject matter is far more compelling than merely the next shiny computing device. If nothing else, Freed is a walking example of why PowerPoint should just be deleted. I'm not sure what presentation software he was using, but it was alive!

Here are some of the things I learned today:
  • There were more wind turbines in 1920 than today.
  • You are 8 times more likely to be killed by a cop than by a terrorist.
  • The average price of a home in Detroit right now is about $5,700.
  • There are 103,000 empty lots in Detroit (where buildings once stood).
  • The 1908 Model T got better gas mileage than the average 2008 figures.
  • The Environment: "too big to fail!"
  • Exxon alone spent more money lobbying Congress last year ($14.9M) than all CleanTech concerns put together.
  • Four times as many people (580M) voted in the American Idol contest as voted in the 2008 presidential election (129M).
  • The U.S. is indeed #1 in some important areas: obesity, crime, military spending, oil consumption, energy use...
I want to try to book Eric Corey Freed at a local peninsula event, if possible. He's quite thought-provoking.

Wednesday, April 14, 2010

Customer Service in the "facebook era"

I have some feedback for facebook that I think would be valuable to their product managers and software people. I have no way to get it to them. Their "customer service" is almost impenetrable. It is clearly designed for idiot prevention, and I can sort of understand that, with hundreds of millions of customers who use the site for free.

However, I think that this doesn't serve them well. Because there are people like me out there, who know how software like this works, who might want to report a bug, or a design flaw, and help them out a little bit. There is absolutely no way to get through their Customer Feedback Prevention mechanism. Other companies are like this, too.

What I think would fix it is a way to say, in effect, "I promise that you will be okay with what I have to say." For example, I could check a box that said, "I authorize you to delete my facebook account and add me to a Russian spam list if you think I'm abusing this privilege". In return, my message should go to a *real* person, in a reasonably high-placed position, who might actually want to hear what I have to say. I know those people exist, because I've worked at places like facebook and the product managers and engineers and marketing people want to know as much as possible about their products and how they are received. It's just that there's no way for someone like me to reach them.

Why iPad will kill Kindle

I have a new iPad. I'm not usually an early adopter, partly because I've worked in the technology industry a long time and I wait a revision or two with most things. But I bought an iPad, partly because they're [relatively] cheap.

But this blog post is about reading. I have not done much reading in the past 20 years. I'm not sure why. I read sometimes on airplanes and on vacations, when I don't have my usual infrastructure around me. I buy books, and I love books, but I don't really read that much. I think it's because I'm so interested in so many things that I do things, instead of reading. I have a huge stack of books I'm going to read really soon. Except, of course, I don't.

So I bought an iPad but didn't think I would read books on it. But I've done a lot of work in electronic publishing and I was curious to see how the experience was. I bought a copy of The Tipping Point, partly so Malcolm Gladwell would get a little more money—he's awesome. It's worth pointing out that I have a paperback copy of The Tipping Point sitting on my desk, as I intend to re-read it, since I only got about halfway through the last time I tried, many years ago.

So here's why the Kindle will lose, and the iPad will win....

I have the iPad with me because of all the things it does. I can read my email, do my online banking, or whatever I think needs doing. But I found myself clicking over to read a few pages of The Tipping Point now and then, when facebook was boring and I had no new email. And I've read about 100 pages of The Tipping Point now, to my surprise.

The crux of it is this: if you have to bring something extra with you in case you want to read, you just won't. Maybe you will for a while, but have you ever brought a book on an airplane, in your carry-on, and gotten back home having not even cracked it open, and wondered why you lugged all that weight around the whole trip? You tend not to do it the next time—you leave the book at home.

And that's precisely the point: the book is always with you, because it's not an extra thing to bring, it's just built right into something you'll probably have with you anyway—and it's just a click or two away, if you already have that device in your hand. Or 100 books, for that matter.

This is a game changer for reading, in my opinion. It is working on me, and I'm a tough audience.

Tuesday, March 16, 2010

Password Security

I have a new web hosting service on one of my web sites that insists I use a password they call STRONG. They forced me to change my existing password, and they won't accept any password of mine that doesn't pass their meter for STRONG passwords.

It irritates me that they think they know better than I, a 25-year computer industry veteran, what makes a good password. They are, in fact, wrong about that. I know better than they do.

First of all, there are really only three ways that your password can be "cracked":
  1. you are an idiot and post your password somewhere it can be read
  2. intuitive guessing by someone who knows something about you
  3. automated, algorithmic guessing by a hacker's computer
Let's assume that 1 won't happen.

Making a password safe against intuitive guessing is a very good idea. Don't use your pet's name, your name spelled backwards, or anything like that. There is a lot to say about this kind of password thinking, but let's just say this: the hacker's computer is not intuitive. That is, they can't really apply a lot of these tests to see if your password is good, like "hmmm, that is the name of Glenn's cat, spelled backward." I don't have a cat. Their computer doesn't know that. A hacker might know that, because they read your blog. Be very careful about what you think people don't know about you, because they just might. My old admin from many years ago, who embezzled from my company, had a cat named Soonie. I bet she didn't know that I knew that.

So assuming you are clever about avoiding intuitive guessing, this leaves essentially only the automated cracking approach. The idea here is that somebody writes a password cracking program that will repeatedly try, for example, all the words in the dictionary, spelled forwards and backwards and with random capitalization, to hack into your account.

Let's look at that for a moment. First of all, no way. I defy anyone to prove that anyone has ever had their account compromised like this. Most computers, and sites, in fact, only allow a few incorrect passwords before suspending your account and not even letting you guess your own password.

That's why everybody puts their elaborate, crack-proof passwords into a Word document, prints it out, and puts it on the wall of their cubicles, because if you forget your own STRONG password that some web site made you choose, you can't get into your own bank account!

But back to the main thread. Let's assume that your site does not have a limit on incorrect attempts, or "exponential backoff", which is a better way to do the same thing (it allows more guessing, but waits for longer and longer intervals between each incorrect guess). These techniques completely eliminate algorithmic password guessing, right? So why would you also insist that your users make a ridiculous password? My point exactly.
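
That backoff idea is easy to sketch. This is a toy model, not real authentication code; `make_throttled_login`, the plain-text password comparison, and the injected clock are all simplifications for illustration:

```python
# Sketch of exponential backoff on failed logins: after each wrong
# guess, double the waiting period before another attempt is allowed.
# A real system would hash passwords and keep this state server-side.

def make_throttled_login(real_password, base_delay=1.0):
    state = {'failures': 0, 'locked_until': 0.0}

    def attempt(guess, now):
        """Return 'ok', 'wrong', or 'wait'. `now` is a timestamp in
        seconds, passed in so the behavior is testable without sleeping."""
        if now < state['locked_until']:
            return 'wait'  # still inside the penalty window
        if guess == real_password:
            state['failures'] = 0
            return 'ok'
        state['failures'] += 1
        # 1s, 2s, 4s, 8s, ... doubling with every consecutive failure
        delay = base_delay * 2 ** (state['failures'] - 1)
        state['locked_until'] = now + delay
        return 'wrong'

    return attempt

login = make_throttled_login('hypothetical-password')
print(login('guess1', now=0.0))                 # 'wrong' (locked until t=1)
print(login('guess2', now=0.5))                 # 'wait'
print(login('guess2', now=1.0))                 # 'wrong' (locked until t=3)
print(login('hypothetical-password', now=3.5))  # 'ok'
```

After about 30 consecutive failures the delay exceeds 30 years (2^30 seconds), which is why even this simple throttle makes brute-force guessing impractical.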

But there's more. The hosting service in question (okay, I'll name them: bluehost.com) makes you use at least one number, at least one punctuation mark, at least one capital letter, more than 8 letters, etc. Why?

Remember the "tens place" and the "hundreds" place? If you just use digits from 0-9, then there are only ten possibilities for each digit. So a 4-digit number has 10,000 possibilities. If you allow every possible 8-bit character value, you have 256 possible "characters" in each location, so it's not 10*10*10*10, it's 256*256*256*256, which is 4,294,967,296 possibilities. That's a LOT, isn't it?
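
The arithmetic is easy to check. (The 256 figure assumes every byte value is allowed in a password, as the paragraph does; most real sites accept more like 95 printable characters, but the principle is the same.)

```python
# Possibilities = (choices per position) ** (number of positions)

digits_only = 10 ** 4   # a 4-digit PIN: ten choices per position
any_byte    = 256 ** 4  # 4 characters with 256 possible values each

print(digits_only)  # 10000
print(any_byte)     # 4294967296
```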

So why do I need at least an 8-character password, if a 4-character password has 4.2 billion possibilities?

And why do you insist that I use capital letters? That doesn't actually help. The possibility that I might use a capital letter, or punctuation, or a digit, is how we get to 256 possibilities for each letter. The automated guessing program doesn't know if I used capitals or not, so it has to guess them anyway. My password is not more secure just because my web site thinks that I need to use "at least one capital letter". I didn't try it, but I wonder if they also insist that I use "at least one lower-case letter". That should be equally important, if the goal is variety.

If what you're trying to do is outwit automated guessing programs, a 2-character password might be even more secure than an 8-character password, because the programs might not bother guessing 2-character passwords, figuring nobody would be that stupid. If they don't try it, then they won't succeed in guessing it, right? So maybe a 2-character password is actually more secure!

And what's totally ironic about requiring that I use at least 8 characters is that it makes the cracking easier. They just eliminated about 7.2e+16 (more than 72 quadrillion) perfectly good shorter passwords that the cracking program doesn't even have to try, because they are disallowed.
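
That count is worth computing. Under the same 256-values-per-position assumption, a minimum length of 8 forbids every password of length 1 through 7:

```python
# Number of passwords ruled out by an "at least 8 characters" minimum,
# assuming 256 possible values per position (as in the text).

eliminated = sum(256 ** length for length in range(1, 8))
print(eliminated)  # 72340172838076672, about 7.2e16
```

That's roughly 72 quadrillion disallowed passwords -- far fewer than the 256^8 (about 1.8e19) passwords of length exactly 8, but still an enormous set that a methodical cracker gets to skip entirely.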

My point is that all of these "safeguards" to make your password more secure against automated guessers are, first of all, a red herring, since I don't think there are really password guessers out there trying to hack into my rightbrain.com web site, and second of all, they don't actually reduce the chances of them guessing my password correctly. The automated guesser is either going to methodically try all 4.2 billion possibilities, or it's not. If it does, it will eventually guess my password, no matter how STRONG it is. If it's not, then if I'm clever enough to keep my friends from guessing it -- it's secure!

I've had passwords on things for 30 years and nobody has guessed any of my passwords. They won't, either, unless I have to write down a "STRONG" password because there's no way I can actually remember it.

The most secure password is one that you don't have to write down, because it prevents people from just finding where you wrote it down. That's why people use their cat's names, spelled backwards. Forcing me to come up with some random sequence of improbable letters makes it much more likely that I'll write it down somewhere.

Give me a break. And let me choose my own passwords, please. You can give me feedback on what you think is "good", but don't force me to use your rules. It's not more secure, it really isn't!


Sunday, January 24, 2010

10 Years Ago...

A friend of mine sent a question to his network on New Year's Eve. He is 38, and I happen to be 48, and his question was this: "I wish I could ask my 48-year-old self for advice, so I know what might be coming in the next decade". So, being wiser than many of his peers, he tried to reach out to people who had just gone through that decade, to see if there was something to learn. I admire the attempt, and I have no idea whether or not one can truly learn from others' experiences. But we have to try, right?

So here is my answer to his question. I don't know if it's interesting, or helpful, either to him or to you, but I thought it might be, and I felt like posting it on my blog. And of course the one thing that's true of blogs is that it doesn't have to be interesting, or helpful. It's my blog, isn't it?

----------

From: Glenn Reid
To:
Re: 10 Years Ago...

>I'm 38, what do I need to know for the next 10 years?
>Answer this however you like (or not!)... use whatever color, examples,
>personal stories or judgments of me as you see fit. You may not even
>know what to say to 38 year old me... but what would you say to
>your own "mid-to-late 30s" self?

Hey Jon,

I'm not sure how many people you sent this question to, or how much response you've received. I have been kind of chewing on this in the back of my mind for a while -- wondering what to say, I guess. I am 48, which perhaps you knew :)

There is no really good answer to your question, I don't think, because experience is not universal. One of the weaknesses of human beings, I believe, is that we fundamentally don't "learn from experience". We think we do, but really we just keep doing the same stuff over and over.

At the core of how you view the world is your belief system. It is built up over years of experience, teachings, accidents, etc. It just represents what you believe to be true. As new information comes to you, in the form of seeing/hearing/experiencing things, the new data either reinforces your belief system or contradicts it. It is how you view contradictory information that defines how you interact with the world. Some people throw out their belief system easily (or large chunks of it) and adopt new theories about life all the time. They are vegan one year, Atkins diet the next. On the other end of the spectrum are those who vigorously defend their belief system against all contradictory information. Those people are generally Catholic, or Republican, or whatever :)

Here's an example of this concept at work, in seeing into the future. This has nothing to do with "you" per se, but I will use the word "you" to make it easier to express.

Your belief system tells you, perhaps, that you are good at what you do, at your chosen profession. You truly believe you're a pretty dang good carpenter. Yet the data may suggest otherwise. You've been laid off, have not gotten promoted, or otherwise fall into the middle to bottom of the pack. If you actually accept this "input" that you're really not doing so well, you might either (a) reject it, and buy a bigger truck, or (b) decide you're a failure and take up a new career. Yet (b) is difficult and perhaps foolhardy, at mid-life. If you're not a great carpenter, what makes you think you can suddenly start selling real estate successfully? So most people muddle along making small changes and justifying the rest.

So to get to the point (if in fact I have one)...

The next decade you're facing, if I were to try to sum it up across most of the people I know, and based on personal experience, is the decade of letting go of some of your dreams.

You know how everybody tells you how fast your kids grow up, and you just kind of listen to them, but increasingly you see glimpses of that yourself? "Wow, that was *three years ago*". Or you see a niece or nephew going off to college and you remember when they were born.

Time really does go faster as you get older, or seems to. I think the reason is that you start to accept something deep and fundamental, that you really don't want to accept. It is best summed up like this: "you may never pass this way again."

I have thought that explicitly, and more and more often. I was in Hutchinson, Kansas, for the first (and last) time, a year and a half ago. It was for an antique truck show, but the reason is unimportant. I looked around and thought, "Wow, I will never come here again in my life. That's kind of weird."

It's not that it's true, or not true. It's that you think it at all. When you're 20, you simply don't have thoughts like that. You assume that you can, and will, go everywhere, do everything, and kick all available butt. There is endless time, you are strong and hungry and ready to rock (usually). You go to Alaska, and you think, "hell, the next time I come here I'm going to rent a plane and fly up to that lake" or whatever. If you go to Alaska when you're 48, you very likely will not think that. You will think, "Wow, I will probably never see this place again in my lifetime."

The reason is subtle: it's not that you couldn't go back to Alaska every year if you wanted to. But you won't really want to, and you know it. You've "been there, done that". And you know that you'd rather do something else with what time you have left. Go to Egypt, maybe.

And therein lies the heart of it. You only have "so much time" left, and you want to spend it more and more wisely. That is old age, when you get down to it. You aren't entering old age, exactly, but your experiences will start to show you that you really aren't going to "get around to" a lot of stuff that you thought you were. And you will subtly, but permanently, let go of a dream or two, in the coming decade. And you will think, at least occasionally, that you are entering the second and final half of your life.

Make that a good thing, not a bad thing. Don't let dreams slip away -- shoot them in the head, and take on new, attainable ones. Having kids is an incredible dream for your own future that you probably didn't really have on your Most Important list when you were 28. But now you know how cool and important it is, so it's not really such a bad thing to let go of some other dream, like having Andalusia open for Ringo Starr. Not that that was ever your dream, of course :)

As a tangential, but related, thought, I think that the reason "old people" don't take easily to new technology has nothing to do with their ability to deal with it, or any kind of cognitive issues of complexity. And it's not Fear, which is often cited. It's simpler than that. It's because "old people" value their time more and more, and they understand that learning the user interface on a BlackBerry will be useless knowledge in 10 years (or 5, or 3) and that it's simply not worth investing their time. It's a variation on the reason that high school kids don't want to learn Trigonometry: "when will I EVER need this in real life?"

The benefit-to-time ratio is calculated more frequently, and more easily, as you get older, and you just know when something is worth it and when something is not. I am starting to realize that now about myself, and it surprises me. I don't bother to learn all the things my iPhone can do, or install apps on it, or whatever. Not because I "can't handle it". I can develop software for the iPhone if I want to. But I don't. Because the iPhone will also be on the scrap heap of history in 5 years, and I don't want to waste time installing apps that won't work in a few years, or whose company will have disappeared. It's like investing in video formats, or audio formats. When MP3 is slightly improved upon by AAC, do you really go back and rip all 1,000 of your CDs into the newer format? I didn't think so. When you bought some of those CDs, 10 years ago, did it ever cross your mind that some day they would be obsolete, and did you wonder if it was worth investing in them? I didn't think so. 10 years from now, when you're 48, I'm pretty sure you will do that calculation in your head when you're considering buying music in some format or other. I know, I know: "what, buy music, are you kidding?!"

Have fun with your new kid, and your "old" one. Say hi to Kirsten for me. And have a good decade :)

Glenn