The UK’s semiconductor industry is dying

Daniel Nenni

Admin
Staff member
Successive waves of foreign takeovers have accelerated the decline of the once leading sector.

In 1952 Geoffrey Dummer, a British engineer working for the Ministry of Defence, was challenged with finding ways to improve the radar systems his team had pioneered during the Second World War. One of Dummer’s proposals was to lay an electrical circuit on a layer of silicon to increase the radars’ reliability. It quickly became clear to him, however, that the effects of the circuit would reach far beyond the navigation systems he was attempting to finesse.

Dummer presented the concept to MoD officials, business leaders and then to the US Electronic Components Symposium in Washington DC. “I shook the industry to the bone when I lectured on the chip,” he said in an interview two decades later. “I was trying to make them realise how important its invention would be for the future of microelectronics and the national economy.”

It wasn’t the British government, nor its defence research institutes, that would go on to develop Dummer’s proposals. It was an American company, Texas Instruments, that seven years later would patent the world’s “first” integrated circuit, the foundational technology upon which the electronics industry and much of the modern world has been built. Semiconductor chips are now used to manage the flow of electricity in phones, computers, cars and household goods.

The origin story of the semiconductor industry should have served as a cautionary tale in Britain. Over the subsequent decades, British scientists and engineers made some of the most significant breakthroughs in the sector. Yet, as with the integrated circuit, it has largely been overseas businesses that have commercialised that research. Even those companies that have successfully monetised British semiconductor advances, such as Arm and Imagination Technologies, have been acquired by foreign owners in recent years.

Two stories have drawn this issue into sharp focus since the start of the pandemic. The first is the shortage of computer chips, which has exposed the fragility of global supply chains. As office workers returned home during lockdowns, there was a surge in demand for laptops, tablets and thus the semiconductors that underpin them. Disruption to freight routes and ports caused by the pandemic had, however, constrained supply of these components, pushing up prices and causing long waiting times for goods, including fridges and dishwashers as well as cars, computers and smartphones......

 
When I was studying computer engineering in the 1980s, perhaps the most influential product on my thinking and interests was the Inmos Transputer. So far ahead of its time, and notably far ahead of the design and fabrication technologies of the day, the Transputer concept didn't have the impact on computer architecture and processor design that one would have expected (or hoped). In the 1970s and most of the 1980s computer engineering had a trickle-down path to broad applications from mainframes and supercomputers. The advent of IBM PCs and Intel's microprocessors turned the trickle down on its head, to a gusher from the mass markets below that eventually defined servers and supercomputers. When I see modern microprocessors with CXL or PCIe fabrics linking them together into local networks, I am still reminded of my first reaction to reading about the Transputer.
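A quick illustrative aside for readers who never met the Transputer: its programming model was CSP-style message passing, with occam channels that could be mapped directly onto the chip's four hardware links, so machines were built by wiring nodes together rather than by sharing memory. The sketch below is not Inmos/occam code and the names (stage, links, the pipeline length) are invented purely for illustration; it uses Go channels, which descend from the same CSP ideas, to mimic a chain of Transputer-like nodes forwarding values along their links.

// Minimal, illustrative sketch of a Transputer-style pipeline using Go channels
// as a stand-in for occam channels / hardware links. Names and sizes are invented.
package main

import "fmt"

// stage models a single node: read from the input link, do some local work,
// forward the result on the output link, and close the link when done.
func stage(id int, in <-chan int, out chan<- int) {
	for v := range in {
		out <- v + id // trivial "work": tag the value with the node id
	}
	close(out)
}

func main() {
	const nodes = 4

	// Wire the nodes into a linear pipeline, the way Transputer links were
	// physically chained board to board.
	links := make([]chan int, nodes+1)
	for i := range links {
		links[i] = make(chan int)
	}
	for i := 0; i < nodes; i++ {
		go stage(i, links[i], links[i+1])
	}

	// Feed work in at one end and collect results at the other.
	go func() {
		for v := 1; v <= 3; v++ {
			links[0] <- v
		}
		close(links[0])
	}()
	for r := range links[nodes] {
		fmt.Println(r)
	}
}

The point of the sketch is the wiring: each node sees only its own input and output links, which is what let Transputer arrays scale by adding chips, and is loosely the property that CXL and PCIe fabrics recreate at the system level today.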


 
Agreed. In the 1980s/90s I worked for a company that made boards with transputers on them that fit into Sun Microsystems chassis. They were for scientific computing. Great concept, but the software required to make them work never got there. Software has always been the key to hardware.
 
The government of the time gave LG a big pile of cash to come to South Wales; alas, a global slowdown meant the fab never even opened. Alongside the Newport Wafer Fab, maybe that region could have been something.

Atmel came and went in Tyneside.

Mitel came, then split into Zarlink and X-FAB; I don't think either manufactures in the UK anymore.

Silicon Glen came and went in Scotland. No doubt the UK still has plenty of folk on the R&D side, but the physical side of the industry isn't there anymore.
 
Two stories have drawn this issue into sharp focus since the start of the pandemic. The first is the shortage of computer chips, which has exposed the fragility of global supply chains. As office workers returned home during lockdowns, there was a surge in demand for laptops, tablets and thus the semiconductors that underpin them. Disruption to freight routes and ports caused by the pandemic had, however, constrained supply of these components, pushing up prices and causing long waiting times for goods, including fridges and dishwashers as well as cars, computers and smartphones......

Did you leave out the second story?
 
Daniel, I'm not at all convinced by this narrative (note: I've spent 35 years in microelectronics - mainly in the UK).

1. When was the UK really a leader in this sector?

There have been some notable innovations - Ferranti invented the gate array, the transputer was mentioned, ARM and Imagination IP - but never any commercial dominance.

2. The New Statesman is hardly an authority on any technical subject.

The NS is a left-wing news magazine (which mainly means "opinion" these days) with a political agenda to push. I can't read the full article (paywall), but I know exactly the sort of line they will be pushing. No one in these opinion magazines has any knowledge or experience of electronics. The author - Oscar Williams - is a journalism graduate, not a technology one, and has never worked in tech.

The articles in these sorts of magazines always start out from a false assumption or statement ... non-tech people never state their assumptions at the outset or quantify the uncertainties or error bars ...

3. You provide no evidence that there is a "decline".

A quick search suggests that employment in microelectronics has been stable over the past decade (not great, but not a decline). There is absolutely a relative decline vs the US, but that's another matter. Within Europe, the UK is relatively strong.

4. You provide no evidence that foreign ownership has reduced employment.

In fact, foreign ownership and management revitalised many UK industries (the car industry in the 1980s and 1990s being a famous example, when the Japanese taught us how to do it properly). That's leaving aside the question of what "national ownership" actually means when a stock is listed and anyone, anywhere can buy the shares.

I am unconvinced that the way microelectronics has developed in the UK in the last 35 years is either bad or a disaster. The management is far better than in the old, large British companies (largely defence ones - GEC, Plessey, Racal, Ferranti, Thorn EMI, etc). The opportunities are far greater and there is a shortage of good graduates - many European graduates come to work in the UK industry because the opportunities are better here. The pay and working conditions are way better than 35 years ago.

What has declined is the local demand for the chips - Europe has lost both telecom switching (largely to China) and mobile handsets (to the Far East and US) during the last 20 years. But that's another story.

Please don't let a New Statesman hit piece about Liz Truss (remember - there's a leadership election for Tory leader and PM going on and the NS hate the Tories) cloud the facts.

The UK has certainly made some mistakes - I never understood why ARM was sold to SoftBank (nor why, if Nvidia was "bad", SoftBank was "better"!). But I think the open, international model is generally to be preferred to the more protectionist one on offer across the Channel.
 