BW brief

Managing sustained innovation for a smarter planet

by Dr Bernard Meyerson, IBM Fellow and VP Innovation

Extraordinary outcomes can be achieved when the right people, infrastructure and motivation come together, says Dr Bernard Meyerson, IBM Fellow and VP Innovation, ahead of his IET/BCS Turing Lecture 'Beyond silicon: cognition and much, much more'.

Innovation has always been essential for any organisation wanting to grow. The importance of sustained innovation, however, has been less widely recognised until recently; it is now taking on greater significance as a result of two key factors.

Firstly, many companies that failed to sustain innovation during the downturn have subsequently struggled or failed to survive. This trend is evident when comparing the findings of McKinsey’s annual Global Survey of CIOs, CEOs and executives. In 2007 (the year before the financial crash), the ability to innovate was valued highly by 54% of respondents; by 2010, this figure had risen to 84%. The bottom line is that after every major economic downturn, it is the companies that have continued to innovate that have survived.

Secondly, we must prepare for a post-silicon era. After decades of consistent improvement in the cost, capability and ubiquity of computing, equally steady progress in silicon technology (often referred to as Moore’s Law) has brought us to the point where the material constituents of transistors have shrunk so far that quantum phenomena render them useless. The notion of everlasting generations of smaller, faster and less costly technology has run squarely into immutable laws of physics. Put succinctly, atoms don’t scale.

These factors have ensured the rebirth of innovation, with future progress in IT performance being realised through new system architectures and materials, and through emerging fields such as cognitive computing and its application to Big Data. But before going any further, some words of caution: it is essential to approach innovation as an engine that can take 30 years to start and three minutes to kill. Companies should acknowledge this before turning the key either way in the ignition.

Essential elements

There are some key ingredients required to enable sustained innovation within an organisation. The most important element, of course, is people. The main challenge, however, is that a very different type of individual is required to manage innovation than would have been sought in the past – the days of relying on someone with four PhDs and three Masters who is unable to communicate their ideas to a wider audience are gone.

Indeed, the type of innovator IBM looks for shares two basic properties: tremendous depth, in that they are exceptionally bright; and the ability to communicate their ideas broadly and operate as part of a team. In today’s commercial environment, innovators that are infinitely deep but have no breadth are limited in value, as they are unable to share their ideas or learn from their peers.


There are also two types of innovator required for sustained innovation. First, there are the 'discontinuous innovators' who have the 'big ideas' and 'ah-ha!' moments, and who every now and again invent something ground-breaking. At IBM, for example, we have Robert Dennard, who invented the one-device memory cell, more commonly known as Dynamic Random Access Memory (DRAM). Dennard also solved one of the conundrums of Moore’s Law by coming up with the right recipe for placing twice the number of transistors on each subsequent generation of chips of the same size, while keeping power requirements constant from generation to generation.
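
To see why that recipe works, here is a minimal numerical sketch of classic constant-field (Dennard) scaling. It is an idealised textbook model with an illustrative scaling factor k, not IBM's actual process rules:

```python
# Simplified illustration of constant-field (Dennard) scaling.
# Shrinking every linear dimension and the supply voltage by a factor k
# packs k^2 more transistors into the same area while leaving the
# chip's power density unchanged.

k = 1.4  # ~sqrt(2): one process generation, doubling transistor density

transistors_per_area = k ** 2                 # twice as many devices per unit area
capacitance = 1 / k                           # per-device capacitance shrinks
voltage = 1 / k                               # supply voltage shrinks
frequency = k                                 # switching frequency rises
power_per_device = capacitance * voltage**2 * frequency   # C*V^2*f ~ 1/k^2
power_density = power_per_device * transistors_per_area   # ~1.0

print(f"transistor density x{transistors_per_area:.2f}, "
      f"power density x{power_density:.2f}")
# transistor density x1.96, power density x1.00
```

Scaling the dimensions and the voltage together is what kept power density flat while transistor counts doubled, generation after generation.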

Then there are the 'continuous innovators': the people who drive the small, steady improvements that are nonetheless vital to sustained innovation. A good example is the team at IBM who drove decades of improvements in disk drive technology. If this type of sustained innovation had not taken place once the disk drive was invented, the average laptop today with an internal spinning hard disk would weigh about 250,000 tons.

In addition to a combination of types of innovator, you also need an infrastructure that enables these people to collaborate effectively on a global basis. Such collaboration is necessary across disciplines, organisations, markets and cultures. Put simply, you don’t get your best result from a single, homogeneous effort.

Scaling versus Moore’s Law

Innovation is a matter of increasing importance because information technology is reaching an impasse. One of the core issues is that Moore’s Law has a logical end. If the industry continued to halve the size of its transistors every 18 months, it would eventually be splitting atoms (and we all know what happens if you do that!). Already, the industry is fast approaching the physical limits of silicon technology. We are also reaching the limits of being able to scale technology out to ever larger systems and data centres.


Indeed, what is remarkable with innovation is that you have to revisit every assumption you have ever made about technology. Everyone assumes that light is never too slow for a given operation, but in fact it can be. Technology today is so fast that one of IBM’s chips performs a complete work cycle in the time it would take a beam of light to travel about 2cm. In the context of a data centre, where many machines may be working together on the same problem but are located hundreds of metres apart, there can be vast delays simply because the optical links carrying signals between them cannot outrun the speed of light.
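
To put rough numbers on how little ground a signal covers per clock tick, here is a back-of-the-envelope sketch. The 5 GHz clock, 200 m separation and roughly two-thirds-of-light-speed signal velocity are illustrative assumptions, not figures for a specific IBM system:

```python
# Back-of-the-envelope: how far does a signal travel in one clock cycle,
# and how many cycles are lost crossing a data centre?

C_VACUUM = 3.0e8        # speed of light in a vacuum, m/s
SIGNAL_SPEED = 2.0e8    # rough signal speed in fibre or copper (~2/3 c), m/s

clock_hz = 5.0e9        # illustrative high-end server clock, 5 GHz
cycle_s = 1 / clock_hz  # 200 picoseconds per cycle

print(f"light per cycle:  {C_VACUUM * cycle_s * 100:.0f} cm")
print(f"signal per cycle: {SIGNAL_SPEED * cycle_s * 100:.0f} cm")

# Two machines 200 m apart working on the same problem:
round_trip_s = 2 * 200 / SIGNAL_SPEED
print(f"cycles spent waiting on a 200 m round trip: {round_trip_s / cycle_s:,.0f}")
# light per cycle: 6 cm; signal per cycle: 4 cm; ~10,000 cycles per round trip
```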

This is why we need to find new ways of making communications faster. Ironically, the trick is to start innovating by inverting everything that’s been done before. This is where it gets really interesting. With 'chip stack' technology, for example, it is possible to create an entire system by thinning individual chips to 50 microns and then stacking some 30-50 of them into a single stack just a couple of millimetres high, reducing communication distances within an individual system from metres to millionths of a metre. Light isn’t getting any faster, so you 'simply' make the communication distances shorter.
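
Putting similarly rough numbers on the chip stack itself (an illustrative sketch using the figures quoted above, not a particular IBM product):

```python
# Rough numbers for a chip stack: how tall it is, and how much signalling
# delay disappears when metres of wiring shrink to tens of microns.

chip_thickness_um = 50      # each die thinned to ~50 microns
chips_in_stack = 40         # mid-point of the 30-50 range quoted above

stack_height_mm = chip_thickness_um * chips_in_stack / 1000
print(f"stack height: {stack_height_mm:.1f} mm")      # ~2 mm

SIGNAL_SPEED = 2.0e8        # rough electrical/optical signal speed, m/s
for label, distance_m in (("1 m across a board", 1.0),
                          ("50 um within the stack", 50e-6)):
    delay_ps = distance_m / SIGNAL_SPEED * 1e12
    print(f"{label}: {delay_ps:,.2f} ps one-way delay")
# ~5,000 ps across a metre of board versus ~0.25 ps between stacked dies
```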

Data, the new oil

Despite nearing its limits, silicon technology will still be employed for many decades yet. The difference is that the innovation will come from the integration of hardware, software, systems and network functionality, compressing communication distances, since light is far too 'slow'. Fundamentally, after years of scaling systems out, the focus has come full circle and the architecture is now 'scale in', since proximity brings huge benefits in both speed and power reduction.

At the same time, chip stacking presents huge technological and material challenges, not least in terms of powering and cooling. You now have 30-50 times the number of chips, and thus transistors, in an area and volume where there had once been only one, so a complete system in a single chip stack presents many issues yet to be fully resolved. Nevertheless, progress is being made with this technology, and since the distance between memory and logic shrinks to microns rather than metres or more, the power required for signalling is reduced substantially. There is also some elegant work being done with optics integrated onto the chip stack, because conventional electrical signals would not be able to transfer data at the speeds, or over the inter-chip distances, required.

Even data itself has become an issue, because information technology is generating ever-greater volumes of data that needs to be stored. More importantly, it must be used effectively. In the words of respected software industry entrepreneur and technology leader Ann Winblad, "Data is the new oil. In its raw form, oil has little value. Once processed and refined, it helps power the world."


Data is now an asset, but it will call for petabytes of storage-class memory, or some vast new high-speed storage device, to manage it, as well as a new suite of software tools – or 'analytics' – to understand it. Again, how we approach data requires an inversion of the conventional model of data management: the quantities of data are now so vast that you cannot move it to a central compute engine; you must invert that process and have many compute engines operating on a vast, central yet constantly evolving data store. Even more critically, over time one must innovate so as to analyse incoming data 'on the fly' and store only one’s learnings, disposing of the vast raw data flow that would otherwise drown one over time.
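
As a toy sketch of the 'analyse on the fly, keep only the learnings' pattern (purely illustrative Python, not IBM’s analytics tooling; the sensor_stream generator is a hypothetical stand-in for the incoming raw data flow):

```python
import random

def sensor_stream(n=1_000_000):
    """Hypothetical stand-in for an unbounded flow of raw readings."""
    for _ in range(n):
        yield random.gauss(20.0, 2.0)

class RunningStats:
    """Welford's online algorithm: one pass, constant memory.

    Only the learnings (count, mean, variance) are kept; every raw
    reading can be discarded the moment it has been processed.
    """
    def __init__(self):
        self.count = 0
        self.mean = 0.0
        self._m2 = 0.0          # running sum of squared deviations

    def update(self, x):
        self.count += 1
        delta = x - self.mean
        self.mean += delta / self.count
        self._m2 += delta * (x - self.mean)

    @property
    def variance(self):
        return self._m2 / self.count if self.count else 0.0

stats = RunningStats()
for reading in sensor_stream():     # the raw data is never stored
    stats.update(reading)

print(stats.count, round(stats.mean, 2), round(stats.variance, 2))
```

The point of the pattern is that the summary grows no larger however much data flows past, which is exactly the inversion described above.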

Shooting for the moon

Integrating analytics tools with real-world data opens up great possibilities for improving lives, enabling problem-solvers to be proactive instead of reactive. For example, advances in data analytics make it possible to predict the flow of traffic and combat congestion and traffic jams. In Singapore, models can now predict future traffic jams from current road conditions, and electronic road pricing is used to manage traffic patterns through toll prices and other measures, steering traffic into new flows to eliminate a jam that would otherwise have developed. One literally alters the future.
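
As a toy illustration of that proactive approach (invented thresholds and a deliberately naive forecast; this is not how Singapore’s road-pricing system is actually modelled or priced):

```python
# Toy congestion pricing: forecast near-future load from current flow and
# raise the toll before the jam forms, rather than reacting after it has.

def predicted_load(current_flow, inflow_rate, horizon_min=15):
    """Naive forecast: extrapolate current flow by the measured inflow rate."""
    return current_flow + inflow_rate * horizon_min

def set_toll(current_flow, inflow_rate, capacity):
    utilisation = predicted_load(current_flow, inflow_rate) / capacity
    if utilisation < 0.7:
        return 0.50     # forecast free-flowing: nominal charge
    if utilisation < 0.9:
        return 2.00     # jam forecast: nudge drivers onto other routes now
    return 4.00         # severe forecast: price strongly before gridlock

# The road still flows freely, but the inflow says it won't in 15 minutes:
print(set_toll(current_flow=1800, inflow_rate=40, capacity=3000))   # 2.0
```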

The uses for Big Data are virtually boundless, but the IT industry and its partners must use these analytical tools to enable solutions for a smarter planet. Probably one of the biggest societal changes passed by virtually unnoticed in 2010: it was the first time in the history of mankind that more people lived in cities than in rural areas. Move forward 20 years and there will be almost 2 billion people migrating to major urban centres. This is a sobering thought, and it is why it is so important that we get cities right.

Back in 2008, IBM’s chairman, Sam Palmisano, said that the company would fund great ideas for a Smarter Planet to the tune of up to $100 million, challenging all employees to propose sustainable businesses serving the public good in key areas of challenge. Sustainability was key, so that society would benefit on an ongoing basis. Setting such a grand challenge is the best way to drive a team to innovate, and possibly the greatest example in modern history is the US space programme.

In 1961, President Kennedy challenged NASA to land a man on the moon and return him safely to earth before the end of the decade. NASA’s space programme saw the Apollo 11 astronauts realise President Kennedy’s dream on 20 July 1969. The Apollo space programme cost is estimated at $25.4 billion, about $150 billion in today’s money, but it ushered in an exciting new era of technological development and has delivered countless innovations of huge benefit to society – from freeze drying technology to the materials used in frying pans.

We’ve certainly come a long way since man first landed on the moon. The Apollo Guidance Computer had just 4k of RAM and a 2 MHz CPU; the average smartphone today has 256 MB of RAM and a dual-core 1.2 GHz CPU. The rub is that the technology shrink that achieved these gains has almost run its course, so we need to find other ways to push things forward.



Dr Bernard Meyerson serves as Vice President of Innovation at IBM. He will present the IET/BCS Turing Lecture at the Royal Institution, London, on Monday 24 February; Cardiff University on Tuesday 25 February; The University of Manchester on Wednesday 26 February; and The University of Edinburgh on Thursday 27 February. The free-to-attend lecture is part of the IET’s Prestige Lecture Series, many of whose lectures were established in memory of engineers who achieved exemplary and ground-breaking work in their day. The speakers invited to give IET Lectures are of that calibre – innovative, forward-thinking and at the top of their game.

For more information, please visit: conferences.theiet.org/turing/index.cfm



