Monthly Archives: July 2014

Write it down if it’s important.

Increasingly, I hear professors giving a lecture say something like “don’t worry about taking notes because the PowerPoint slides will be posted.”  Having a copy of the lecture slides is obviously incredibly helpful when reviewing.  However, given that some of the most solidified knowledge I remember came from painstakingly recorded class notes (or a very, very funny professor), the “do X because Y” logic applied to note-taking strikes me as strangely dissonant.

A lecturer who recommends against taking notes makes the following assumptions: (1) the delivered lecture can be fully captured in a set of PowerPoint slides, and (2) reviewing his or her PowerPoint slides provides a near-identical experience to reviewing one’s own paraphrase of the relevant learning points.  Assumption #1 is one the lecturer makes about the educational content itself and is outside the learner’s control.  Assumption #2, however, is one made about the learner, and I’m not so sure that it’s true.

In the digital age, the world has moved away from manual production of information and into data automatism.  Books used to require manual copying, which was labor-intensive and expensive; that gave the actual reproduction of writing value.  The advent of printing made the reproduction of information dramatically cheaper, but creating information de novo was still labor-intensive and considered valuable.  Then came the arrival of the computer file system and electronic books (quick age test: when you think “file,” do you think of a computer folder with Word documents or an actual manila folder with paper files?).

On the other hand, the cost of creating good information has fallen more slowly.  The labor of recording creative thoughts has decreased: we no longer carve words onto tree bark; some of us have even stopped writing on paper altogether.  However, creating information ultimately relies on an innate ability to convert thoughts into something the five senses can digest – words, images, sounds, gestures, dances.

So the underlying question is this: is “taking notes” a creative or replicative learning process for you?

Moving from past perfect to simple future

“What would have been” is easy to imagine.  It’s everything that we don’t have but we want, glazed with the syrup of optimism and a flair of fiction.

“What will be” is also easy to imagine.  It’s everything that hasn’t happened yet but will inevitably come to pass once we take our next actionable step, permeated with the grating texture of reality and a hint of truth.

The past perfect tense is exactly what it is – it’s perfect.  But “what would have been” is not quite past perfect.  It’s actually the past perfect conditional.  Conditional because we should have made that perfect decision in the past, but now it exists only in the imagination.

“What will be” is the simple future tense.  It looks ahead with a prediction of the near future.  It’s not quite “what will have been.”  The future perfect is a little far ahead, a little scant on realism.

Simple future isn’t necessarily better or worse than the past perfect conditional or the future perfect.  However, it is different, and we sometimes think too little about it.  So next time you find yourself looking back and thinking down a bifurcation towards a fictional future, it might be worth asking yourself, “what’s the next actionable step, and am I willing to take it?”  It brings out the real you.

A Double Take on “What’s Your Take?”

“What’s your take?” is a question people sometimes ask when they want your opinion on the subject at hand.  Sometimes it also means they are actually asking whether you agree with them.  If you agree, you get the opportunity to paraphrase their opinion.  If you disagree, the phrasing spares you from starting your response with “No,” as would be necessary with the question “do you agree?” (or risk not actually answering that simple yes-or-no question).

It’s an opportunity to create an engaging discussion without confrontation, a question worth pondering over for a few seconds before answering.

What’s better than better?

Throughout management training I was taught “More isn’t better; better is better.”

But there’s a problem with being better. To be better means to be compared against something.  Sometimes it’s competition against another person – do better than that rival.  Sometimes it’s competition against the self – do better than what you did yesterday.  Sometimes it’s competition against an ideal – as in “you can do better.” Being better implies optimizing on something, that somewhere above where we stand exists a higher level of achievement.  To be better means to take what we already do and improve it against the criteria of an existing rubric.

The problem with “better,” then, is that it only works when your customers – and here “customers” takes on a wide meaning: patients, buyers, parents, whatever – use the same rubric that you do.

Being better is also very hard to do – to find the “best” out of N choices, a computer makes N-1 comparisons, each comparison based on a rubric of many parameters. Humans take shortcuts by substituting this algorithm with a heuristic – answering a much easier question.  Most of the time, “who’s the best doctor/plumber/dogsitter?” gets subconsciously substituted with “who comes to mind first when I mention doctor/plumber/dogsitter?”
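The contrast between the two selection processes above can be sketched in a few lines of code.  This is purely illustrative – the candidate names and rubric scores are made up for the example – but it shows why the exhaustive scan costs exactly N-1 comparisons while the availability heuristic costs none.

```python
# Two ways to answer "who's the best?" - a rubric-based scan vs. a heuristic.
# The names and scores below are invented for illustration.

candidates = ["Dr. Adams", "Dr. Baker", "Dr. Chen", "Dr. Diaz"]
rubric = {"Dr. Adams": 7.2, "Dr. Baker": 9.1, "Dr. Chen": 8.4, "Dr. Diaz": 6.8}


def best_by_rubric(names, scores):
    """Linear scan: exactly N-1 comparisons for N candidates."""
    best = names[0]
    for name in names[1:]:  # N-1 iterations, one comparison each
        if scores[name] > scores[best]:
            best = name
    return best


def best_by_heuristic(names):
    """Availability heuristic: whoever 'comes to mind first' wins,
    with no comparisons at all."""
    return names[0]


print(best_by_rubric(candidates, rubric))   # Dr. Baker - the rubric's answer
print(best_by_heuristic(candidates))        # Dr. Adams - merely first recalled
```

The two functions can disagree, which is exactly the essay's point: a customer running the heuristic may never even evaluate the candidate the rubric would pick.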

In other words, our customers frequently end up thinking very differently about “better,” and being better isn’t always better.

We end up with our very own type of better – how we uniquely contribute to the team, the organization, the customer.  How we communicate these qualities shapes how we stand apart from their other choices. Different is how people remember us, walking away from that first meeting.  Different is what stops us from becoming a substitutable commodity.  Different is better than better.

Taking the second step

We all know getting started is tough; that’s not news.  Writing the introduction to your paper, the first day on a new job, starting a company – all tough tasks.  Taking the first step requires a certain amount of know-how.

Taking the second step, though, is an entirely different matter.  It requires trusting that the first step you took was in the right direction, and that you are ready to commit and take things further.  With the first step, you are just experimenting.

The second step requires courage. It is what transforms a footprint into a path.