The NetFPGA10G accelerates hardware networking research, providing high-speed interfaces and memory subsystems that support faster and more complex prototypes.
NetFPGA10G Team • Michaela Blott • Adam Covington • Jonathan Ellithorpe • Paul Hartke • John Lockwood • Nick McKeown • Sachidanandan Sambandan • Kees Vissers • Tatsuya Yabe • James Zeng
Motivation / Objective • Leverage the latest technology to create a networking reference platform • Enable researchers, students and customers to build working prototypes of hardware-accelerated networking systems • Upgrade interface speeds and components to enable faster and more complex prototypes • Successor to the highly successful NetFPGA networking research platform, used by over 500 universities worldwide
High-Level Specification • Xilinx Virtex-5 XC5VTX240T-2 FPGA (FF1759 package) • Four 1G or 10G Ethernet subsystems • QDRII SRAM subsystem • RLDRAMII CIO DRAM subsystem • PCIe x8, Gen1 interface • High-speed serial expansion interface • PCIe adapter: full-size, dual-slot card with extra ATX power connector • Standalone operation supported
Memory Interface - SRAM Subsystem • QDRII, 300MHz, CY7C1515JV18-300BZXC • Overall density: 216Mb • Organized as 3 independent 36-bit interfaces • 64.8Gbps raw access for reads and 64.8Gbps for writes @ 300MHz • Sized to handle flow classification, routing table lookup, flow statistics, and free/used lists [Diagram: three SRAM devices, each with a 36-bit read data (dout) path and a 36-bit address/control/write data (adr, ctl, din) path]
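The SRAM bandwidth figure follows directly from the interface parameters on this slide. The sketch below is a back-of-the-envelope check (my own arithmetic, not part of the NetFPGA10G deliverables), assuming three independent QDRII interfaces, 36-bit buses, a 300MHz clock, double-data-rate transfers, and separate read and write ports.

```python
# Hypothetical sanity check of the QDRII SRAM figures quoted above.
# Assumptions: 3 independent interfaces, 36-bit data buses, 300 MHz clock,
# DDR transfers, and separate read/write ports (QDR architecture).

NUM_INTERFACES = 3
BUS_WIDTH_BITS = 36
CLOCK_HZ = 300e6
DDR_FACTOR = 2  # two transfers per clock cycle

# Per-direction raw bandwidth (read and write ports are independent in QDR)
per_direction_gbps = NUM_INTERFACES * BUS_WIDTH_BITS * CLOCK_HZ * DDR_FACTOR / 1e9
print(f"Raw read bandwidth:  {per_direction_gbps:.1f} Gbps")   # 64.8 Gbps
print(f"Raw write bandwidth: {per_direction_gbps:.1f} Gbps")   # 64.8 Gbps

# Density check: 3 devices x 72 Mb each
print(f"Total density: {3 * 72} Mb")  # 216 Mb
```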
Memory Interface - DRAM Subsystem • RLDRAMII CIO, 250-300MHz, MT49H16M36HT-25:A • Overall density: 2.3Gb (57ms of storage for 40Gbps traffic) • Arranged as 2 independent 64-bit interfaces (2 components per interface) • 76.8Gbps shared raw access for reads and writes @ 300MHz • Sized to handle packet buffering [Diagram: two RLDRAMII CIO interfaces, each with shared address/control lines and a 2x32 = 64-bit data path]
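The DRAM numbers can be checked the same way. The sketch below assumes two independent 64-bit interfaces at 300MHz with DDR transfers on a common I/O bus shared between reads and writes, plus the 2.3Gb density and 40Gbps line rate quoted above.

```python
# Hypothetical sanity check of the RLDRAMII figures quoted above.
# Assumptions: 2 independent interfaces, 64-bit data buses, 300 MHz clock,
# DDR transfers on a common I/O (CIO) bus shared between reads and writes.

NUM_INTERFACES = 2
BUS_WIDTH_BITS = 64
CLOCK_HZ = 300e6
DDR_FACTOR = 2

shared_gbps = NUM_INTERFACES * BUS_WIDTH_BITS * CLOCK_HZ * DDR_FACTOR / 1e9
print(f"Shared raw read/write bandwidth: {shared_gbps:.1f} Gbps")  # 76.8 Gbps

# Buffering depth: how long 2.3 Gb of DRAM can absorb a 40 Gbps line rate
density_gb = 2.3
line_rate_gbps = 40
buffer_ms = density_gb / line_rate_gbps * 1e3
print(f"Buffering time at 40 Gbps: {buffer_ms:.1f} ms")  # ~57.5 ms, matching the ~57 ms above
```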
Network Interface Flavours • PHY: four single-port PHYs (NetLogic AEL2005) • 10G or 1G interfaces via SFP+ cages • Supported physical standards: • 10G: direct-attach copper and optical interfaces (10G LRM, 10G SR, 10G LR) • 1G: 1000BASE-T and 1000BASE-X [Diagram: the FPGA drives each of the four 10G PHYs over XAUI (4 GTXs @ 3.125Gbps); each PHY connects via SFI to an SFP+ cage]
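As a side note on the XAUI figures in the diagram: four serial lanes at 3.125Gbps carry a 10Gbps Ethernet payload once 8b/10b coding overhead is accounted for. The quick check below is my own arithmetic, assuming standard XAUI lane parameters rather than anything stated on the slide.

```python
# Illustrative check of the XAUI lane-rate arithmetic.
# Assumptions: standard XAUI with 4 lanes, 3.125 Gbps per lane, 8b/10b coding.

LANES = 4
LANE_RATE_GBPS = 3.125
CODING_EFFICIENCY = 8 / 10  # 8b/10b: 8 payload bits per 10 line bits

raw = LANES * LANE_RATE_GBPS
payload = raw * CODING_EFFICIENCY
print(f"Raw serial bandwidth: {raw:.2f} Gbps")      # 12.50 Gbps
print(f"Usable payload rate:  {payload:.2f} Gbps")  # 10.00 Gbps (10G Ethernet)
```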
Expansion Connector • Purpose: • allow greater port density • support different physical interface flavours • add daughter cards with extra memory or NSEs/KBPs • 2 flavours: • daughter-card expansion • board-to-board with expansion cable • Connector: • two QTH Samtec connectors • designed for XAUI, PCIe, SATA, InfiniBand, HyperTransport, etc. • Board-to-board cable: • twinax micro-ribbon cable • available in different lengths, trading distance against speed: for example 0.5m (100 Gbps) or 1m (50 Gbps) • recommended part number: HQDP-020-05.00-STL-SBR-2 [Diagram: the FPGA drives each of the two Samtec connectors with 10 GTXs @ 6.25Gbps]
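One plausible reading of the distance-vs-speed figures (my interpretation, not spelled out on the slide): with both connectors populated, 2 x 10 GTX lanes at 6.25Gbps give 125Gbps of raw serial bandwidth, or about 100Gbps of payload after 8b/10b coding, matching the 0.5m figure; longer cables presumably require lower lane rates, and the 3.125Gbps value used for the 1m case in the sketch is purely my assumption.

```python
# Hypothetical reading of the expansion-connector bandwidth figures.
# Assumptions: both Samtec connectors used (2 x 10 GTX lanes), 8b/10b line
# coding on every lane, and lane rates that drop as cable length grows.

CONNECTORS = 2
LANES_PER_CONNECTOR = 10
CODING_EFFICIENCY = 8 / 10

def payload_gbps(lane_rate_gbps: float) -> float:
    """Aggregate payload bandwidth across all expansion lanes."""
    return CONNECTORS * LANES_PER_CONNECTOR * lane_rate_gbps * CODING_EFFICIENCY

print(f"0.5 m cable @ 6.25 Gbps/lane:  {payload_gbps(6.25):.0f} Gbps")   # 100 Gbps
print(f"1.0 m cable @ 3.125 Gbps/lane: {payload_gbps(3.125):.0f} Gbps")  # 50 Gbps
```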
NetFPGA1G vs. NetFPGA10G comparison • In a nutshell...
FPGA Designs • Stress-test design • proves the board is functional under maximum load (temperature, power, noise) • Basic & full infrastructure designs • provide basic RTL infrastructure to all users in the form of well-defined, abstracted interfaces • Planned reference designs • PCIe-ICAP design (allows configuration of the FPGA via PCIe) • Stanford reference router design • HLS platform • and others...
More information... • Alpha Program • Pricing is still being negotiated; it will be heavily subsidized for academics, as Micron, Cypress, NetLogic Microsystems and Xilinx are donating parts • The board was built in collaboration with HiTech Global, who will also distribute it for academic and commercial use • Visit HiTechGlobal's website: • http://www.hitechglobal.com/Boards/PCIExpress_SFP+.htm • Status: release to manufacture in the coming weeks • Visit the NetFPGA website for the latest updates: • http://www.netfpga.org • If you haven't already, sign up to the netfpga-announce mailing list: • https://mailman.stanford.edu/mailman/listinfo/netfpga-announce
THANK YOU! mblott@xilinx.com
Demo: Stress-test Design [Diagram: a test computer (host CPU running the PCIe application & driver) connects over PCIe to the NetFPGA10G board; the FPGA runs QDR memory, RLDRAM memory, PCIe, SFP+, clock & power, and config & flash tests (the latter together with the CPLD), plus filler logic, and reports results over UART to a Hyperterminal session]