I’ve been reading a presentation from Sharkfest 2012 where engineers from Microsoft present Demon – Microsoft’s Datacenter Scale Distributed Ethernet Monitoring appliance. The whole presentation is interesting, but this particular slide caught my attention:
This suggests that merchant silicon (from companies like Intel, Broadcom and Marvell) will reach 256 ports of 10GbE by 2015. It’s really hard to get hold of a roadmap for merchant silicon – it’s tightly controlled and secretive – so this is the best I can find on merchant silicon futures.
What does this mean? Existing top-of-rack switches with 48 x 10GbE ports actually have 64 x 10GbE ports in silicon; the other 16 x 10GbE ports are usually presented as 4 x 40GbE uplinks.
These are likely to be replaced by 256 x 10GbE silicon, most probably configured as 64 x 40GbE. Each 40GbE interface can run as either 4 x 10GbE or 1 x 40GbE because the 40GbE PHY is built from four 10GbE lanes.
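The port arithmetic above can be sketched in a few lines. This is purely illustrative – the `breakout_config` helper and its inputs are my own, not vendor data – but it shows how the same pool of 10GbE lanes in the switch ASIC divides between 40GbE ports and 10GbE breakouts.

```python
# Illustrative sketch of switch-ASIC port arithmetic (not vendor data).

LANES_PER_40GBE = 4  # a 40GbE PHY is four 10GbE lanes


def breakout_config(total_10gbe_lanes, ports_used_as_40gbe):
    """Split an ASIC's pool of 10GbE lanes between 40GbE ports and 10GbE ports."""
    lanes_for_40gbe = ports_used_as_40gbe * LANES_PER_40GBE
    if lanes_for_40gbe > total_10gbe_lanes:
        raise ValueError("not enough lanes on this ASIC")
    return {
        "40GbE ports": ports_used_as_40gbe,
        "10GbE ports": total_10gbe_lanes - lanes_for_40gbe,
    }


# Today's typical ToR silicon: 64 lanes = 48 x 10GbE plus 4 x 40GbE uplinks.
print(breakout_config(64, 4))    # {'40GbE ports': 4, '10GbE ports': 48}

# 256-lane silicon run entirely as 40GbE gives the 64 x 40GbE configuration.
print(breakout_config(256, 64))  # {'40GbE ports': 64, '10GbE ports': 0}
```

The same helper also shows why 256-lane silicon is flexible: any of those 64 QSFP ports could instead be broken out to four 10GbE ports as server needs dictate.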
In my opinion, a QSFP is highly likely to be cheaper than four 10GBase-SR SFP+ modules. Although efforts to build short-range transceivers for top of rack are under way, I’m doubtful they will get much traction in the market.
The EtherealMind View
As I see it, the following points become a bit clearer:
- 40GbE will arrive for top-of-rack solutions in 2016
- Switches in the campus backbone and aggregation layers should be ready for replacement or upgrade in 2016 to support 40GbE
- Think carefully before installing new cabling in your data centre or campus backbone. 40GbE uses 8 fibre cores for multimode but only 1 pair for single mode. The multimode cable should be OM4, although OM3 will work over shorter distances. Provision the least amount of cable you can until new cabling solutions arrive.
- Money spent on expensive 10GbE switches is likely to be wasted, as they will probably be replaced in 2016 with 40GbE. Most server people are already deploying or asking for 4 x 10GbE per chassis, and in two to three years’ time it will probably be cheaper to use a single 40GbE QSFP than four 10GbE SFP+ modules.
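To make the last point concrete, here is the break-even arithmetic behind "one QSFP vs four SFP+". The prices are hypothetical placeholders of my own choosing, not real market data; the point is only that a single QSFP wins as soon as it costs less than four times one SFP+ module.

```python
# Break-even arithmetic for one 40GbE QSFP vs four 10GbE SFP+ modules.
# Both prices below are HYPOTHETICAL placeholders, not quoted market prices.

sfp_plus_price = 300.0   # hypothetical price of one 10GBase-SR SFP+
qsfp_price = 900.0       # hypothetical price of one 40GbE QSFP

four_sfp_cost = 4 * sfp_plus_price
print(f"4 x SFP+ = {four_sfp_cost:.0f}, 1 x QSFP = {qsfp_price:.0f}")

# QSFP is the cheaper way to deliver 40Gb to a chassis whenever it
# undercuts four individual SFP+ modules.
print("QSFP cheaper" if qsfp_price < four_sfp_cost else "four SFP+ cheaper")
```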
I have nothing to disclose in this article. My full disclosure statement is here