A tale of two microchips
OK, I was wrong: my earlier take focused too much on Intel's present difficulties and technology. Apple is moving to Intel processors. The rationale behind the move is being endlessly debated in the media, but there were some interesting strands that I wanted to pull out:
- At the present time the PowerPC is a great architecture; none of this is about chip design, but about investment in chip manufacture. Jobs's keynote focused on the suppliers' product roadmaps and on computational performance versus energy consumed (and dissipated). Intel is looking to make powerful, efficient microprocessors suitable for mobile applications; IBM wants to become a one-stop shop for ASIC design and manufacture
- The Power architecture will continue in some of the world's best servers and mainframes, vindicating its microchip heart. But then you don't need a mainframe sitting on your lap while you read your email on a commuter train
- IBM stopped development of Apple's processors because the business was unprofitable
- Sony tried to sell Apple on its Cell processor and failed; Jobs is supposed to have disliked the architecture. Moving the OS over to a new concept like Cell would have been a bet-the-farm move that left him little room to manoeuvre. Sony would have liked to have Apple on board to fill capacity in its own chip fabrication plants
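The performance-per-watt framing in the keynote point above is worth making concrete. A minimal sketch in Python, using made-up numbers purely for illustration (not the keynote's actual figures): a chip that is only slightly faster but draws a fraction of the power wins the ratio decisively, which is what matters for a laptop.

```python
# Toy performance-per-watt comparison. All numbers below are
# hypothetical placeholders chosen for illustration only.
chips = {
    "incumbent part": {"perf": 1000, "watts": 100},   # hypothetical
    "roadmap part": {"perf": 1100, "watts": 31},      # hypothetical
}

for name, spec in chips.items():
    # The ratio, not raw speed, decides what you can put in a laptop.
    ratio = spec["perf"] / spec["watts"]
    print(f"{name}: {ratio:.1f} performance units per watt")
```

On these invented figures the roadmap part is only 10 per cent faster, but delivers over three times the performance per watt.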
Essentially, what all these points boil down to is this: the process of making microchips is becoming more and more expensive. This is why the electronics industry has seen the rise of the 'fabless' chip company over the past ten years. Once you have the engineers, designing a chip is relatively inexpensive, thanks to powerful, cheap off-the-shelf workstations running specialist software; for ASIC designs, some of the software is even available as a free download. However, pushing the bounds of physics to get more components working properly on a smaller piece of silicon requires a huge amount of money, so the ability to manufacture microchips is becoming concentrated in the hands of ever fewer players. Apple never represented more than 3 per cent of IBM's chip fabrication capacity, according to reports I have seen online, so it was a no-brainer for IBM to lose Apple and focus on larger-volume customers like the games-console makers.
Interesting questions:
- Why has Sun's SPARC architecture (or Fujitsu's flavour of it) not come up in the discussions? Sun would have benefited from an additional customer for the SPARC architecture, and a RISC-to-RISC move would have been well received in the Mac community
- When can we expect to see the first PC reject a fixed chip architecture and move to FPGAs instead? This is already happening in specialist electronics that do complicated computations in areas such as video processing