Steve Furber is not as well known as he should be, which is surprising given that he is one of the leading pioneers of personal computing.
As part of the key team at Acorn Computers in the early 1980s – the developers and manufacturers of the famed BBC Micro (or Beeb, as it was affectionately known) – he was instrumental in designing the ARM (Acorn RISC Machine) chip, which made the company’s hugely successful PCs almost twice as fast as anything else on the market.
And it is this innovation which underpinned the rapid growth in mobile communications, which has opened up economic opportunities for millions in the developing and developed world.
The ARM first appeared in the Acorn Archimedes in 1987, making Acorn the first company to ship RISC-based personal computers for the mass market.
Acorn founder Hermann Hauser has said that Steve Furber is ‘one of the brightest guys I’ve ever worked with – brilliant. When we decided to do a microprocessor on our own I made two great decisions – I gave them two things which National, Intel and Motorola had never given their design teams: the first was no money; the second was no people. The only way they could do it was to keep it really simple.’
Nearly 30 years on Steve Furber (now the ICL Professor of Computer Engineering in the School of Computer Science at the University of Manchester) is still working with ARM processors, although on a much grander scale than the humble Archimedes.
- RM:
- Steve, I think I’m right in saying that you were a member of the Cambridge University Processor Group, a club for computer hobbyists, when you were a student there. Was this rather like the Homebrew clubs in the US? When were you bitten by the computer bug?
- SF:
- Yes, CUPG was very much a homebrew computer club, formed by Cambridge students. There the real men built their computers from TTL – only the wimps like me used microprocessors! I got “bitten by the bug” as a result of being drawn into CUPG, which I joined because I was interested in flying and flight simulators, and computers seemed a good way to build a flight simulator.
- RM:
- Was there anything that drew you into computers other than ‘I seem to be good at this’?
- SF:
- I guess it was a combination of my interest in flight simulators and my amateur electronics experience. I’d got rather put off building electronics in my teens because I struggled to make transistor circuits work (though I did get on better with valves!), but then I discovered the 741 op amp. As a Maths student the 741 gave me an abstraction I could work with, hiding all the low-level transistor details inside a clean black box. I built guitar effects boxes and two 8-channel sound mixing desks using 741s and PCBs I etched in my kitchen sink.
Digital electronics offered another clean abstraction that enabled me to build stuff that worked in a different domain – computers.
- RM:
- One of your first major projects was designing the BBC Micro, a machine designed to accompany a computer literacy programme set up by the BBC. Did you in your wildest dreams expect it to take off as it did?
- SF:
- We expected the BBC Micro to be a success, which is why we were so pleased to get the contract. But “success” meant selling the expected 12,000 machines. No-one anticipated the way home computers would take off in the early 1980s, to the extent that total Beeb sales were around 1.5 million.
The first sense I got that this thing might exceed our wildest dreams was when we were lined up to give a seminar at the (then) IEE Savoy Place. I think this was 1982. The main lecture theatre at Savoy Place seats several hundred, but three times the capacity turned up. Coach-loads of people had come some distance, for example from Birmingham, to hear about the BBC Micro. A lot had to be sent away to avoid exceeding the safe capacity of the lecture theatre, and we were booked back to give the seminar two more times (and many other times around the UK and Ireland) to meet demand.
- RM:
- What do you think inspired people to crowd to see you and buy PCs in the numbers that they did?
- SF:
- I think there was a widespread realisation that home computing was coming, and it was going to be exciting, useful and fun. But the wider public was nervous about the great diversity of machines available, all produced by small companies they hadn’t heard of and found it hard to trust. Then in came the BBC Micro, bearing one of the most trusted names in the land. That was the signal they needed to take a step into the unknown.
Sure, the BBC Micro was a bit more expensive than competing machines, but if I’m buying a product I don’t fully understand I always prefer to pay a bit more for a name that I trust. And I like to think that the machine did live up to the brand expectations – it was solidly built (some Beebs survived ten years in the hands of primary school kids) and had sound educational credentials, attracting extensive educational support.
I still frequently come across folk who tell me that the BBC Micro introduced them to programming and was the foundation for their subsequent career.
- RM:
- Acorn had huge success in the late 1970s and early 1980s, seeing its profits rise from £3000 in 1979 to £8.6m in July 1983, but it stumbled two years later and was later taken over by Olivetti. Do you think the company could have been saved had the ARM architecture project happened sooner?
- SF:
- Having ARM earlier wouldn’t have saved Acorn. ARM had to get out from the constrained Acorn market into the much more open System-on-Chip market that got them into mobile phones, and the SoC business only became technically feasible (with enough transistors on a chip to integrate all of the non-memory functions) in the 1990s.
- RM:
- When you were designing processors at Acorn they generally consumed less than a watt. Do you feel rather glum at the power demands of today’s high-end processors? Is this a consequence of the fact we don’t have a sufficient grip on building parallel software?
- SF:
- Yes, and yes! The energy-efficiency of computers is a growing concern, and the lengths we have gone to in order to maximise single-thread performance at the cost of energy-efficiency are increasingly hard to justify. There is no way forward now apart from going parallel, and even the high-end boys have thrown in the towel on single-thread performance. They are selling us multicore processors we still don’t know how to use. Once we do know how to exploit parallelism there will be no need for high-end processors at all, because we will be able to get the same performance with much greater energy-efficiency by using larger numbers of simpler processors. I expect to see this transition soon in data centres, where the load has a lot of easily accessible parallelism and where energy concerns are already at the top of the agenda, and even in high-performance computers, where I see this as the most promising route to exascale.
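A rough back-of-envelope sketch (my illustration, not from the interview) of why many simple cores can beat one fast core on energy: dynamic CMOS power scales roughly as P ∝ C·V²·f, and supply voltage V must itself scale roughly with clock frequency f, giving P ∝ f³. Splitting the same aggregate throughput across N cores clocked at f/N then costs N·(f/N)³ = f³/N².

```python
def relative_power(n_cores: int) -> float:
    """Total power of n cores each at 1/n clock speed, relative to
    one core at full speed, under idealised P ~ f^3 voltage scaling.
    Aggregate throughput is the same in both cases."""
    f = 1.0 / n_cores          # each core's clock, normalised to full speed
    per_core = f ** 3          # dynamic power of one slowed-down core
    return n_cores * per_core  # total across all cores = 1 / n^2

for n in (1, 2, 4, 16):
    print(n, relative_power(n))  # 1 -> 1.0, 2 -> 0.25, 4 -> 0.0625, 16 -> ~0.0039
```

The idealised cubic scaling overstates the real-world saving (voltage cannot scale all the way down, and static leakage is ignored), but the direction of the argument holds.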
- RM:
- You’re working now on the SpiNNaker project that you’re leading at Manchester, which aims to mimic the complex interactions in the human brain. What’s the higher motivation behind the project, and does this in any way stem from Doug Engelbart’s vision about augmenting the human mind and the interaction between a machine and its user, which led directly to the invention of the PC?
- SF:
- The “higher motivation” for SpiNNaker is the observation that computers aren’t the only information processing systems on the planet, and they aren’t even the best at some tasks. But we still don’t know how the other sort – biological brains – work. This seems to me to be a fundamental gap in scientific knowledge. Computers are now approaching the performance required to build real-time models of brains (but they aren’t quite there yet – a computer model of a human brain would require at least an Exascale machine), so can we accelerate the understanding of the brain by designing a computer that is optimised for this task? This will then offer a platform for neuroscientists, psychologists and others to develop and test hypotheses on a new scale.
Scale is important. We usually like to start small, get some understanding, and then scale up building on this. But there are some places, including some ideas in the neural network field, where starting small doesn’t work. There are good theoretical insights into why this should be, relating to the counter-intuitive properties of high-dimensional spaces. The maths simply stops working if the problem is below a certain (large) size. So sometimes you have to jump in at the deep end, and SpiNNaker offers a very deep pool to do this in.
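One of the counter-intuitive properties of high-dimensional spaces alluded to above can be shown in a few lines. In this illustrative sketch (my example, not Furber’s), pairwise distances between random points concentrate ever more tightly around their mean as the dimension grows, so distance-based reasoning that works in low dimensions behaves very differently at scale.

```python
import math
import random

def distance_spread(dim: int, n_points: int = 50, seed: int = 1) -> float:
    """Ratio of std deviation to mean of pairwise distances between
    random Gaussian points in `dim` dimensions. Shrinks as dim grows."""
    rng = random.Random(seed)
    pts = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_points)]
    dists = [
        math.dist(p, q)
        for i, p in enumerate(pts)
        for q in pts[i + 1:]
    ]
    mean = sum(dists) / len(dists)
    var = sum((d - mean) ** 2 for d in dists) / len(dists)
    return math.sqrt(var) / mean

for d in (2, 10, 100, 1000):
    print(d, round(distance_spread(d), 3))  # spread falls steadily with d
```

At dimension 2 the distances vary widely relative to their mean; by dimension 1000 nearly all pairs are almost equidistant, which is one reason small-scale neural-network experiments can fail to predict large-scale behaviour.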
- RM:
- You’re working with the Royal Society to figure out why the number of students taking computing classes has halved in the past eight years. What are the most important, but fundamental, things the computing industry can learn from its past? Do you think that the industry has been wrong about what computing is, where it should go and how to improve it?
- SF:
- I’ll be able to answer this better when the study has drawn its conclusions. But on the evidence of other studies of this area, the problem seems to lie in the transition from the computer as a universal programmable platform for exploring ideas (as with the BBC Micro in the 1980s) to an office tool that runs productivity software. Much of what is taught in schools is IT rather than computer science. IT is important but intellectually unchallenging, and often dull. It’s as if all that was taught in Maths was arithmetic, or in English, spelling. IT, arithmetic and spelling are all important skills, but there is *so* much more in all these subjects.
- RM:
- Lots of people have tried to come up with languages or programming systems that allow non-programmers to program. Is that a doomed enterprise?
- SF:
- Currently programming is an extremely demanding discipline, requiring mental abilities ranging from scaling multiple levels of abstraction to chasing very low-level details around at the bottom of a vast software system to track down a bug. I don’t think there is any way you can train the entire population to become skilled programmers at this level – it is a peculiar skill, not unlike being a theoretical physicist in its requirement to think abstractly while paying painstaking attention to detail.
If the goal is to introduce a wider audience to the ideas in programming – “computational thinking” – then there may be scope for a simpler language that is less universal than those used by professional programmers, has fewer “death traps” for the unwary, and is perhaps more visual than symbolic in its representation.
I always thought BBC BASIC was an excellent introductory language, but I just get laughed at when I suggest it these days!
- RM:
- Over the last 20 years the Internet has grown exponentially, but operating systems and other computer software haven’t. Do you think that the Internet concept could be imitated and used as a basis for an operating environment that doesn’t have an operating system?
- SF:
- It seems to me that operating systems have grown exponentially, in memory requirement if not in functionality!
I’m not sure how this relates to the question, but we have seen cycles of shift of computing power between central resources and the user at the periphery. With cloud computing we are seeing a shift back towards the centre, partly driven by improving communication services and partly by the need to mobilise the user terminal device. This is seeing the PC give way to the smart phone and iPad-like terminal, which moves quite a lot of the operating system functionality up into the Cloud.
- RM:
- Do you think many people working in technology are unaware of its history and have little curiosity about where languages came from?
- SF:
- Technology tends to attract folk who are more interested in the future than the past, so they often have very little sense of the history of their subject, and very little curiosity. But as folk get older their horizons get wider, and I think most technologists in the second half of their careers develop some sense of the historical path that has led to the way things are today.
- RM:
- Do you feel we’re progressing in technology even though sometimes it seems that we are leaping backwards?
- SF:
- We are definitely making progress in technology, faster than ever. After over 30 years in the business I still find new products astonishing. I carry my entire CD collection on my iPhone, but I remember a time at Acorn when we debated whether solid state music would ever be economically feasible. The iPad is a long-standing dream come true – again, back in the Acorn days we talked about similar products for schools (though without the connectivity, which was inconceivable then), but of course the technology just wasn’t ready.
- RM:
- When you look back at your career on all the things you have done is there one time or a period that stands out among all the others?
- SF:
- I guess my early years at Acorn would have to stand out – from 1981 to 1985 – since that period covers the BBC Micro and the first ARM chip, and those are the foundations of my subsequent career. The success of the BBC Micro was tangible at the time, as described above, whereas the ARM was a long time coming to fruition, and required a great deal of work by a lot of other folk, not to mention a fair dose of serendipity, to get to the 20 billion total shipments to date.
But it’s knowing that this scale of impact is possible that drives me on and determines the directions I choose to take my research today. SpiNNaker has the potential to generate similar impact, though there are many contingencies and, as with all research, it’s highly speculative.