Friday, October 28, 2016

Paul's Update Special 10/28



Fifth-generation wireless (5G)

5G is just a marketing term right now, but there is no denying that mobile data consumption is exploding, and all of our future technologies will require vastly faster, ubiquitous wireless connectivity. Demands on networks are doubling every year. At this rate, with a bit of quick math, we can see that in the next decade we will have 1,000x the demand for mobile data.
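To see why doubling every year lands at roughly 1,000x, here is a quick sketch of the compounding arithmetic (my illustration, not from the original piece):

```python
# Quick check of the "1,000x in a decade" claim: demand that doubles
# every year grows by a factor of 2**n after n years.
years = 10
growth = 2 ** years
print(f"{growth}x")  # prints "1024x", i.e. roughly 1,000x today's demand
```

Ten doublings compound to 1,024, which is where the 1,000x figure comes from.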

To meet these skyrocketing demands, we need to densify the mobile networks of years past. This means moving from macrocell sites that cover neighborhoods to small cells that cover blocks, down to femtocells and picocells that bring high-speed, synchronized connectivity to your home and other human-scale settings. Bandwidth has become as vital to cities as water, good roads and electricity were to the thriving cities of the past. It is the base technology that all the others are built on. Look for ubiquitous gigabit-speed wireless in leading cities over the next decade.

Computer vision (CV)

Of all the sensors available, video-as-a-sensor will emerge as the most important tool — and maybe the most controversial. CV allows the broadest range of possibilities and supports the greatest number of possible use cases. From understanding density of populations, to usage patterns, to speed of traffic, to how resources are being used, CV will quite literally be the eyes of the city. The real power of these technologies will be unlocked when we combine them.

This comes with well-warranted concerns for privacy, but with pioneers like Intel managing tight security and on-device processing, and Movidius promising insanely detailed chip-based object recognition, there’s reason to believe we’ll arrive at a scenario that protects our privacy. And if you can’t imagine the city being covered in cameras, take a look around and count the number of cameras already in the city. Nearly every store, street corner, cab and cop has a camera. The change will occur in swapping out the human who reviews a recording today for a computer that processes the images instead.

Mixed reality (MR)

Mixed reality is what we call it when virtual reality, augmented reality and plain-old reality blend together: digital overlays that incorporate real physics and computer graphics to create immersive experiences with the best of the physical and the digital. MR has perhaps the most limitless possibilities of all five technologies listed here, and will likely serve as the eventual replacement for the mobile phone.

From virtual goods that reduce strains on our resources to overlays on the world for entertainment, education and work, MR has astounding implications. With the help of companies like Samsung, Google and Magic Leap, the use cases and interfaces that Google Glass or Pokémon Go have hinted at will transform into seamless, natural combinations of the internet, physical city infrastructure and society. Imagine changing the architecture of a city on the fly, or making certain information about yourself visible to different circles: your relationship status in a dating overlay, or your blood type and heart rate for first responders.

Autonomous vehicles (AV)

We have more than one billion cars on the planet. That’s trillions of dollars of rapidly depreciating infrastructure sitting on the streets, used for only a fraction of its life. And even when cars are used, their most efficient use (when driven close to capacity at the top of their performance profile) happens briefly and rarely, like when you are loaded up with kids and all your stuff from your summer vacation.

We have given these cars billions of square feet of prime real estate in cities around the world. Autonomous vehicles not only have the potential to change the idea of car ownership and last-mile travel, but also radically change the way we manage logistics and delivery of goods. We will see cities’ use of space and people’s travel habits change dramatically over the next decade, enabled by changes to vehicle sizes and the addition of intelligent routing, breaking down car travel into everything from package-delivering drones (think Amazon Prime Air or Starship), to micro-buses making commutes efficient and cheap, to intelligent, tiny, easy-to-use single-person rideables.

Artificial intelligence (AI)

We need brains to bring these technologies together and make them work. And we’re not talking about order-taking robots or computers we need to program with every detail. AI should let us manage by objective: tell the system what we want to achieve, help it when it needs it, and course-correct it as it goes along.

With learning systems that can scale massively in scope via cloud computing and maintain responsiveness across billions of interactions on the most minute level via in-device edge processing, AI is the most unpredictable and existential technology of the bunch. It has the potential to bring things together to help us solve critical macro issues — we can use AI to make vastly better use of our resources, solve equity issues and prevent crime — and the smaller, more personal challenges, like finding a better way to get to work, meet a mate or optimize your schedule to better suit your desired lifestyle.

Keep your eyes open and you’ll see these five technologies popping up in development today, and then gradually becoming a core part of pretty much all of our interactions with each other and the city over the next 10 years.



Trying to create urgency reflects the age-old confusion of hurrying with speed — the misguided notion that if you’re not always hurrying, you’re already behind. As anyone who has ever forgotten their keys in an effort to get out of the house on time knows, hurry often backfires. The panicked, frantic pace values action over results and has consequences.

Shortcuts and sloppiness. Even a top-notch team isn’t immune from the pressure, and little things start to pile up — small UX issues we don’t have time to fix, less than thorough test coverage, failure to monitor key processes and tech debt in a new feature.

Limited space for creative solutions. Stepping back takes time, so it’s often discouraged in pro-urgency environments, but it’s in these minutes or hours that weeks and months are saved.

Micromanagement. Trying to create a sense of urgency takes effort. Be careful dismissing this point too quickly — even if you would never micromanage, you could be creating the wrong incentives for managers in your group.

Loses potency quickly. Artificial urgency around milestones wears off fast; once every deadline is urgent, none of them are.

Takes over communication stream. When communicating urgency is the highest-priority message, more important messages ("why are we taking on this project?" and "who is going to benefit?"), the very messages that get great results, are drowned out.

Let’s retire the "sense of urgency" and instead look for a sense of purpose:

  • A sense of purpose is a deep understanding of the reasons behind our efforts and a desire to pour in time and energy because that purpose resonates with the impact we’d like to make on the world.
  • A sense of purpose is immersion in our cause, and allowing that exposure to motivate action. 
  • A sense of purpose is about going faster and smarter toward a mission we all see clearly. 
  • It’s about using good judgment because we all understand the short and long term implications of our actions on what we’re creating together.
  • A strong sense of purpose manifests when a software engineer watches a potential customer struggle with a workflow and stays late to make the changes that make it easier. 
  • It shows itself when a designer spends their weekend on a few extra iterations because they feel engaged with the problem at hand and want to produce a better solution.

Turns out when you stop looking to create urgency, the passion and purpose latent in your team might just cause the right things to get done at just the right pace.

Crafting a sense of purpose is different than creating urgency. A team with a high sense of purpose can look a lot like a team with a high sense of urgency. Output is high. People are engaged. The critical distinction is that what you do as a leader has very little overlap. Creating a sense of urgency is about deadlines, nagging and sheer speed. 

Fostering a sense of purpose is different. It’s a collaborative endeavor, and it requires trust that your team members will translate their sense of purpose into increased effectiveness.

Your primary job as a leader is to hire the right team, and then to spend time inspiring this sense of purpose. Help people understand the impact of their work, and speed will follow.



In his latest book, Competing Against Luck: The Story of Innovation and Customer Choice, Clayton Christensen, a Harvard Business School professor, with co-authors Taddy Hall, Karen Dillon, and David Duncan, tries to explain why some products are successful and so many are not. The difference is based on what he calls the “jobs to be done” theory.

Customers, he says, aren’t really interested in products or services themselves but in what they do. When people have a need, that’s a job to be done; products or services are simply “hired” to do it. Innovative companies focus on developing products that do the jobs customers want, no matter what form they take.

Quartz magazine interviewed Christensen:

Quartz: Why did you call your book “Competing Against Luck”? It seems like the real competition is not luck but a company’s own flawed understanding of its product and its customers.

Christensen: The reason we chose that title is that every time a company puts a new product into the market, it has done its market research as best it can and believes the product will be successful. But the data shows that only somewhere around 20% will be successful and 80% won’t, even though companies thought those 80% would succeed too.

Quartz: How does the jobs-to-be-done theory fit into your more established disruptive-innovation theory? They seem to nest together.

Christensen: At its core, disruption is a theory about competitive response. If I have a new innovation I want to introduce into the marketplace, I want to predict whether competitors in that market are going to ignore me or fight me. The theory of disruption helps you understand that quite well. But in many ways, it’s not a manual for how to grow or how to predict what customers want. [Jobs to be done] is the second side of the same coin: How can I be sure that competitors won’t kill me, and how can I be sure customers will want to buy the product? So it’s actually a very important complement to disruption.

If you organize your company around a job to be done, it’s actually a lot harder for a new entrant to disrupt you, because most disruptive companies just have my product versus your product, and mine’s cheaper than yours. Jobs to be done is actually good protection for companies worried about being disrupted.

Quartz: How do you account for things that no one thinks they need but turn out to be very useful or desirable?

Christensen: We discover jobs to be done; we don’t develop them or consciously iterate toward them. In the book, we’re trying to say, now that you understand the idea, there’s a methodology that will help you identify these jobs. You’ve got to keep your ear to the ground, because you’ll happen upon the job.

Quartz: But as someone who knows a lot about branding, you might want to be associated with this powerful brand of disruption.

Christensen: I have never wanted to be famous.



It’s a condition familiar to a broad swath of American workers. You need a free stretch of time to tackle a problem or concentrate on a piece of writing. But diversions and interruptions keep coming.

The lure of a place apart, if only a psychological one, is a recurring theme in Deep Work: Rules for Focused Success in a Distracted World, a popular new book that argues for the virtues of longer stretches of uninterrupted thinking. Its author is Cal Newport, a Georgetown University professor of computer science specializing in the theory of distributed algorithms.

“Spend enough time in a state of frenetic shallowness and you permanently reduce your capacity to perform deep work,” Newport warns.

“I think it is a timely book,” says Adam Grant, Wharton management professor and author (and himself the subject of one of Newport’s chapters). “One of the things we know from nearly a century of research is that people are not good at parallel processing. They are good at serial processing. And where people never really fully engage, it’s hard to get a lot of work done. Cal has done a terrific job of highlighting how intense focus gets better results both in terms of quality and quantity.”

Newport uses various writers, scientists, executives and academics – including himself – as examples of those who can produce at a high rate while “rarely working past 5 p.m. or 6 p.m. during the workweek.” The key is deep work. There is a neurological basis for how sustained concentration yields improvements — in everything from playing a musical instrument to solving complex problems. Newport cites science suggesting that the more you do something, the more you build up layers of myelin, a fatty tissue, around the corresponding neural circuits. “To be great at something is to be well myelinated,” he writes. “This repetitive use of a specific circuit triggers cells called oligodendrocytes to begin wrapping layers of myelin around the neurons in the circuits – effectively cementing the skill.”

There is much in the way jobs are organized today that is at odds with producing high-quality results. Multitasking can be a drain on concentration. But even moving among projects – in the way that many workers go from one meeting to the next – comes with a built-in inefficiency.

Deep work is a matter of developing the right habits or rules — or at least, the right rules for you.

Newport’s recommendations boil down to vigilantly guarding mission and time. Schedule every minute of the day, he advises, and carefully guard the time you need to focus. Tools must be curated, he says; each must pass a test: does it have a substantially positive impact, a substantially negative impact, or little impact on your goals?
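Newport's tool test can be read as a simple filter. As a loose illustration (my framing, not code from the book), score each tool's net impact on your goals and keep only the clear positives:

```python
def curate(tools):
    """Keep only tools whose net impact on your stated goals is clearly
    positive. `tools` maps a tool name to an impact score: +1 (substantially
    positive), 0 (little impact), -1 (substantially negative)."""
    return [name for name, impact in tools.items() if impact > 0]

# Hypothetical scores for illustration only.
kept = curate({"email": 1, "social media": -1, "news aggregator": 0})
print(kept)  # prints ['email']
```

The point of the rule is the asymmetry: "little impact" is not a reason to keep a tool, only "substantially positive" is.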

Deep work, says Wharton emeritus management professor Marshall W. Meyer, is partly derived from the creativity research of Mihaly Csikszentmihalyi and his theories of flow — the state of complete absorption in which ideas seem to build upon each other. “The difference is time,” says Meyer. “Today we’re distracted by messaging and social media. We need respite.”

And better judgment. Confronted with endless options for diversion, a lack of constraints, and multiplying modes of communication vying for attention, workers need structure and criteria for making choices on where to expend a limited supply of attention. “It’s very simple,” says Meyer. “You make the firm strategy-centric, in the sense that you go through the exercise — the vision, the mission, the implementation of strategy, and you remind people that everything they do has to link back to those elements of the mission. If it makes sense, pay attention. If it doesn’t, don’t. Because strategy is a focusing tool.”

Friday, October 21, 2016

Paul's Special Update 10/21




“There is no point in having a 5-year plan in this industry. With each step forward, the landscape you’re walking on changes. So we have a pretty good idea of where we want to be in six months, and where we want to be in thirty years. And every six months we take another look at where we want to be in thirty years to plan out the next six months.”
— Facebook’s Little Red Book

In our industry, our reference frame is day-by-day, sprint-by-sprint, feature-by-feature. What’s happening today, tomorrow, next month? We’re intimately familiar with the details but unaware of the bigger picture. We can easily become myopic, blind to the pace of technology and to the larger forces that will ultimately have more influence on whether our product succeeds or fails than which feature comes next.

When we read about failed companies such as Blockbuster, Kodak, or RIM, we’re often told that they were “caught by surprise” or “didn’t see it coming.” The truth is much more complicated. In each of these examples, the companies were acutely aware of the threat posed by the new technology. In fact, they were often the first to see it—Kodak invented digital photography, and RIM brought the first smartphones to market. What they were wrong about, however, was just how fast the technology was moving. They had the wrong frame of reference. 

We fall into a curious trap when we think about the future: we know too much about the present, and that detail blinds us. Our timeframe is too short; measured on that scale, changes are difficult to notice. Think about this the next time you read a product review.

But what if I ask you to imagine your product in thirty years? Something appealing happens when you contemplate that time horizon. It’s so far into the future that the little details have to fall away. Zooming out to see the industry at a geological time scale brings things into focus: technology is progressing faster than most of us imagined, and will only continue to do so.

The year 1986—thirty years ago—fell within most of our lifetimes (or close enough). It wasn’t that long ago. But think about how far technology has advanced since 1986. The personal computer on a worker’s 1986 desktop has morphed into the smartphone, a device that fits into our pockets, is more than 300x faster, has 2,000x as much memory, and costs less than 5% as much. (Oh, and it’s always connected to everyone else on Earth and the entire sum of human knowledge.)

If you want to prepare for the future, move the goalposts. Try a thirty-year plan. This exercise aims to answer, “where might the world be?” We want to think about the external forces that will shape the future, not the features derived from them. This is also about forecasting, not predicting. Predictions concern themselves with future certainties, whereas forecasts map out ranges of possible outcomes, continuously updated as more information becomes available. Think weather forecast, not lottery prediction.

In many ways, this is similar to scenario planning. Peter Schwartz, one of the gurus of scenario planning, describes it as a process where “managers invent and then consider, in depth, several varied stories of equally plausible futures.” Schwartz offers a useful framework for leading the discussion and contemplating the types of forces that will shape your future. He uses the handy acronym STEEP — Social, Technological, Economic, Environmental, and Political.
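To make the exercise concrete, here is a minimal sketch of walking a planning group through the STEEP categories in order; the driving forces listed are hypothetical placeholders, not Schwartz's own examples:

```python
# Walk a planning group through Schwartz's STEEP categories, prompting
# for each driving force. The forces listed are hypothetical examples.
steep_forces = {
    "Social":        ["urbanization", "aging workforce"],
    "Technological": ["ubiquitous gigabit wireless", "machine learning"],
    "Economic":      ["shifting trade patterns"],
    "Environmental": ["water scarcity"],
    "Political":     ["data-privacy regulation"],
}

for category, forces in steep_forces.items():
    for force in forces:
        print(f"[{category}] How might '{force}' reshape the landscape in 30 years?")
```

The structure matters more than the tooling: the categories force the discussion beyond the technological forces a product team reaches for first.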

When you get to the T-forces (Technological), be careful. This is where you can fall into the trap of trying to predict products and technologies rather than imagining trends and forces. To help with this, I recommend consulting Kevin Kelly’s latest book, The Inevitable. Unlike futurists who try to predict, Kevin focuses instead on identifying forces, which he calls “motion verbs”:

I make no attempt to predict which specific products will prevail next year or the next decade, let alone which companies will triumph. These specifics are decided by fads, fashion, and commerce, and are wholly unpredictable. But the general trends of the products and services in 30 years are currently visible.

Their basic forms are rooted in directions generated by emerging technologies now on their way to ubiquity. This wide, fast-moving system of technology bends the culture subtly, but steadily, so it amplifies the following forces: Becoming, Cognifying, Flowing, Screening, Accessing, Sharing, Filtering, Remixing, Interacting, Tracking, Questioning, and then Beginning.

Once you have considered what these possible futures look like, you can form an opinion about where your product should go, which long-term trends you can’t ignore, and which trends you might need to hedge against. Your periodic thirty-year plan feeds into your six-month plan. It should also help you with every product manager’s toughest challenge: deciding which things not to do. Which efforts just aren’t important to the long term, yet distract you from what does matter?

This exercise doesn’t have to happen often—once or twice a year is probably ideal. It is possible to spend too much time with your head in the far future and end up being way too early. Periodically setting aside the ant’s eye view for an alien’s tour of the universe can help you put the future into perspective; plus it’s fun. It’s sooner than you think.



Ever since an 18th-century biologist called Luigi Galvani made a dead frog twitch we have known that there is a connection between electricity and the operation of the nervous system. We now know that the signals in neurons in the brain are propagated as pulses of electrical potential, whose effects can be detected by electrodes in close proximity. So in principle, we should be able to build an outward neural interface system – that is to say, a device that turns thought into action.

In fact, we already have the first outward neural interface system to be tested in humans. It is called BrainGate and consists of an array of micro-electrodes, implanted into the part of the brain concerned with controlling arm movements. Signals from the micro-electrodes are decoded and used to control the movement of a cursor on a screen, or the motion of a robotic arm.

A crucial feature of these systems is the need for some kind of feedback. A patient must be able to see the effect of their willed patterns of thought on the movement of the cursor. What’s remarkable is the ability of the brain to adapt to these artificial systems, learning to control them better.

Inward neural interfaces – ones that provide inputs to the brain – also depend on the brain’s ability to adapt to them. 
  • Cochlear implants, which can restore some measure of hearing to the profoundly deaf, have been around for several decades now. These take signals from an external microphone and, after signal processing, transmit a series of pulses to electrodes that excite the auditory nerve. Making sense of those pulses depends on the brain’s impressive ability to learn to adapt to this new kind of input.
  • The first trials of retinal implants have now taken place, in which signals from a camera are used to stimulate retinal neurons in vision-impaired patients. 

The key message of all this is that brain interfaces are now a reality, and that the current versions will undoubtedly be improved. For now, though, our neural interface systems are very crude. One problem is size: the micro-electrodes in use today, with diameters of tens of microns, may seem tiny, but they are still coarse compared with the sub-micron dimensions of individual nerve fibres. And there is a problem of scale. The arm-movement control system, for example, consists of 100 micro-electrodes in a square array; compare that to the many tens of billions of neurons in the brain. The fact that these devices work at all is perhaps more a testament to the adaptability of the human brain than to our technological prowess.

So the challenge is to build neural interfaces on scales that better match the structures of biology. Here, we move into the world of nanotechnology. Perhaps the most promising direction will be to create a 3D “scaffold” incorporating nano-electronics, and then to persuade growing nerve cells to infiltrate it to create what would in effect be cyborg tissue – living cells and inorganic electronics intimately mixed. This prospect might be achievable in our lifetimes, but what does remain very far away is the transhumanist dream of being able to obtain a complete readout of the brain – a transcript of the state of the mind. 

As brain interfaces improve, they will bring real benefits to many, and some ethical issues too. We will still be a long way from the seamless integration of humans and machines, but the science fiction vision of the cyborg will become real enough to give us pause for thought.



Findings from the 2016 digital business global executive study and research report.

Many companies are responding to an increasingly digital market environment by adding roles with a digital focus or changing traditional roles to have a digital orientation. Nearly 90% of respondents to a 2015 global survey of managers and executives conducted by MIT Sloan Management Review and Deloitte anticipate that their industries will be disrupted by digital trends to a great or moderate extent, but only 44% say their organizations are adequately preparing for the disruptions to come.

Preparing for a digital future is no easy task. It means developing digital capabilities in which a company’s activities, people, culture, and structure are in sync and aligned toward a set of organizational goals. Most companies, however, are constrained by a lack of resources, a lack of talent, and the pull of other priorities, leaving executives to manage digital initiatives that either take the form of projects or are limited to activities within a given division, function, or channel.

A key finding in this year’s study is that digitally maturing organizations have organizational cultures that share common features. These features consistently appear in digitally maturing companies across different industries. The main characteristics of digital cultures include: an expanded appetite for risk, rapid experimentation, heavy investment in talent, and recruiting and developing leaders who excel at “soft” skills. Leading a digital company does not require technologists at the helm.

To help companies better prepare for their digital futures, we delved into how digitally maturing organizations strengthen their cultures and develop the talent that drives them. Highlights of our findings include the following:
  • Creating an effective digital culture is an intentional effort:
    Nearly 80% of respondents from digitally maturing companies say their companies are actively engaged in efforts to bolster risk taking, agility, and collaboration.
  • Senior-level talent appears more committed to digitally maturing enterprises:
    Companies that give their senior vice presidents, vice presidents, and director-level leaders the resources and opportunities to develop themselves in a digital environment are more likely to retain their talent.
  • Digitally maturing organizations invest in their own talent:
    More than 75% of digitally maturing organizations surveyed provide their employees with resources and opportunities to develop their digital acumen, compared to only 14% of early-stage companies. Success appears to breed success — 71% of digitally maturing companies say they are able to attract new talent based on their use of digital.
  • Soft skills trump technology knowledge in driving digital transformation:
    When asked about the most important skill for leaders to succeed in a digital environment, only 18% of respondents listed technological skills as most important. Instead, they highlighted managerial attributes such as having a transformative vision (22%), being a forward thinker (20%), having a change-oriented mindset (18%), or other leadership and collaborative skills (22%). A similar emphasis on organizational skills above technical ones for succeeding in digital environments was also reported for employees.
  • Digital congruence is the crux:
    To navigate the complexity of digital business, companies should consider embracing what we call digital congruence — culture, people, structure, and tasks aligned with each other, with company strategy, and with the challenges of a constantly changing digital landscape. Alignment on one dimension is not enough: an organization with a flat and nimble structure may still struggle if its culture fears risk. When culture, people, structure, and tasks are firing in sync, however, businesses can move forward successfully and confidently.



Once you have identified a great idea for a new product or service innovation, what do you do? Many firms put together a cross-functional team (XFT) and tell them to bring the product to market. It seems like a good approach: the team has budget, people, and empowerment. What could possibly go wrong? Lots of things. Here are six common reasons why XFTs fail to deliver.

1. No High-Level Sponsor
Ideally, the team should report to someone in the C-suite. The dilemma is that they have to be empowered and left alone but they also need access to high-level authority on occasion. 

2. Too Much Oversight and Control
They should be allowed to bypass many internal approval processes and take control of the project themselves.

3. Wrong or Unclear Objectives
The emphasis should be on fast feedback rather than fast payback. Vague goals are not much help but even worse are targets which are too tight, ambitious and restrictive. 

4. Wrong Mix of Skills and Functions
You need a diverse team with the right mix of skills.  If all the people on the team are creative types then they have great ideas but nothing much gets done. 

5. Insufficient Resources
The team cannot be expected to produce much if they can only meet after work every third Tuesday. They need time, space and money. 

6. Opposition from Vested Interests
The biggest problem is often one that is not anticipated – political opposition from internal departments who see them as a nuisance or a threat. 

There is nothing wrong with establishing a cross-functional team to implement an innovation project. However, you cannot just set it up and forget about it. The team needs support, direction, and encouragement. It also needs freedom and empowerment. Above all, it needs help to overcome all the business-as-usual pressures that will oppose an innovation initiative.