14 Hardware
14.1 Base-station resources
Each Powder base station site includes a collection of software defined radio (SDR) devices from National Instruments (NI) connected to Commscope and Keysight antennas. A handful of sites (one now, three more later) include massive MIMO equipment from Skylark Wireless. Each device has one or more dedicated 10G links to aggregation switches at the Fort Douglas aggregation point. These network connections can be flexibly paired with compute at the aggregation point, or, slightly further upstream, with Emulab/CloudLab resources; a profile sketch of such a pairing follows the NI SDR equipment list below.
- Antennas (connectivity defined in SDR equipment section)
- 1 x 10-port Commscope VVSSP-360S-F multi-band
360 degree horizontal beamwidth, ~20 degree vertical
8.2 dBi gain from 2.3 - 2.69 GHz
4.9 dBi gain from 3.4 - 3.8 GHz
4-port MIMO-capable Cellular elements (1695 - 2690 MHz)
4-port MIMO-capable CBRS elements (3400 - 3800 MHz)
2-port 5 GHz elements (5150 - 5925 MHz)
- 1 x 1-port broadband Keysight N6850A omnidirectional
50 - 6000 MHz
- NI SDR Equipment
Note: The current configuration is described below. This is expected to change in the coming months as NI N310 units are brought online and RF front-end circuits are completed and deployed (ETA Sept. 2019).
- 2 x USRP X310 with UBX160 daughtercards
Channel ’A’ TX/RX and RX2 ports of device 1 connected through an RF front-end to a Cellular port of the VVSSP-360S-F
Channel ’A’ TX/RX port of device 2 connected to a CBRS port of the VVSSP-360S-F
RF front-end for Band 7 cellular providing FDD, reduced noise figure, and 4W maximum (saturated) power
No add-on RF front-end for additional gain/functionality on the CBRS path yet
2 x 10 GbE backhaul links
Keysight (broadband) antenna currently unavailable.
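As an illustration of pairing radios with compute, a Powder experiment profile (written with geni-lib) can request an SDR and a compute node together and connect them across the 10G fabric. The following is a minimal sketch only; the hardware-type names are illustrative placeholders, not authoritative identifiers:

  # Minimal geni-lib profile sketch: one SDR paired with one compute node.
  # Hardware-type names below are illustrative placeholders.
  import geni.portal as portal

  request = portal.context.makeRequestRSpec()

  sdr = request.RawPC("sdr")
  sdr.hardware_type = "x310"      # placeholder: a base-station X310

  comp = request.RawPC("comp")
  comp.hardware_type = "d740"     # near-edge compute (see the compute section)

  # Join the two on a link over the experimental network.
  link = request.Link(members=[sdr, comp])

  portal.context.printRequestRSpec()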
- Skylark Wireless Massive MIMO Equipment
Skylark equipment consists of chains of multiple radios connected through a central hub. The hub is 64 x 64 capable and has 4 x 10 GbE backhaul connectivity. Powder provides "big iron" compute (Dell d840 nodes; see the near-edge computing section) for pairing with the massive MIMO equipment. A brief radio-configuration sketch follows the equipment list below.
Note: Available at Merrill Engineering Building site
- 32 x IRIS-030-D 2x2 transceiver radios
Includes IRIS-FE-03-CBRS front-end modules
BRS and CBRS capable: 2555 - 2655, 3550 - 3700 MHz
26 dBm, 2 x 2 TDD
- Connected to dual-polarized antenna elements
100 degree beamwidth, 5.5 dBi
- 1 x FAROS-ENC-05-HUB aggregation hub
Provides power and connectivity to all IRIS SDRs
- The 32 SDRs are connected to the hub in six chains
4 chains have 4 Iris devices each
2 chains have 8 Iris devices each (extended chains)
13.2 Gbps connectivity across individual chains
Trenz TE0808 SOM with Xilinx Zynq UltraScale+ XCZU9EG MPSoC
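The Iris radios present a SoapySDR interface, so software on the paired compute nodes can configure individual array elements directly. A minimal sketch, assuming Skylark's "iris" Soapy driver; the serial number and the tuning values are placeholders:

  import SoapySDR
  from SoapySDR import SOAPY_SDR_RX

  # Open a single IRIS-030 by serial number (placeholder value).
  iris = SoapySDR.Device(dict(driver="iris", serial="RF3E000042"))

  # Configure both channels of the 2x2 transceiver for the CBRS band.
  for ch in (0, 1):
      iris.setFrequency(SOAPY_SDR_RX, ch, 3.6e9)    # within 3550 - 3700 MHz
      iris.setSampleRate(SOAPY_SDR_RX, ch, 7.68e6)  # LTE-like rate, placeholder
      iris.setGain(SOAPY_SDR_RX, ch, 20)            # dB, placeholder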
Upcoming changes: By Spring 2020, Powder base station sites will each include 4 x NI SDR devices: two USRP X310 SDRs (with UBX160 daughtercards) and two USRP N310 SDRs. Custom, band/radio-specific RF front-ends will be included, and each SDR will be dedicated to a particular band:
- One N310 for 4 x 4 Cellular access (Commscope VVSSP-360S-F)
- One N310 for 4 x 4 CBRS access (VVSSP-360S-F)
- One X310 for 2 x 2 5 GHz band access (VVSSP-360S-F)
- One X310 for broadband access (Keysight N6850A)
The custom RF front-ends will have PA and LNA components with 2W peak power, and both TDD and FDD paths available. The particulars of these front-ends will be made available as the designs are completed and tested.
14.2 Dense-deployment Base-station resources
Each Powder dense deployment base station location contains an NI B210 SDR connected to a Commscope antenna and a small compute node connected via fiber to the Fort Douglas aggregation point.
- Antenna (connectivity defined in SDR equipment section)
- 1 x 10-port Commscope VVSSP-360S-F multi-band
360 degree horizontal beamwidth, ~20 degree vertical
8.2 dBi gain from 2.3 - 2.69 GHz
4.9 dBi gain from 3.4 - 3.8 GHz
4-port MIMO-capable Cellular elements (1695 - 2690 MHz)
4-port MIMO-capable CBRS elements (3400 - 3800 MHz)
2-port 5 GHz elements (5150 - 5925 MHz)
- NI SDR Equipment
Note: The current configuration is described below. This is expected to change in the coming months as an NI N310 unit is deployed in place of the B210.
- NI USRP B210 SDR on compute node
Channel ’A’ TX/RX port on a CBRS port of VVSSP-360S-F
Connected via USB 3.0 to the compute host (described below; a receive sketch appears at the end of this section)
- Intel Compute Node
- 1 x Neousys Nuvo-7501 Coffee Lake ruggedized computer
Intel Core i3-8100T (4 cores, 3.1 GHz)
32 GB wide-temperature range RAM
512 GB wide-temperature range SATA SSD storage
The compute node is connected via two 1 Gbps connections to a local switch, which in turn uplinks via 10 Gb fiber to the Fort Douglas datacenter. The two connections are:
- A 1 Gbps Ethernet “control network”. This network is used for remote access, experiment management, etc., and is connected to the public Internet. When you log in to the node in your experiment using ssh, this is the network you are using. You should not use this network as part of the experiments you run in Powder.
- A 1 Gbps Ethernet “experimental network”. This interface can be connected to other Powder or CloudLab nodes via the Fort datacenter switches.
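Because the dense-site B210 is a standard UHD device on USB 3.0, an experiment running on the compute node can drive it directly. A minimal receive sketch using the UHD Python API; the frequency, rate, and gain values are illustrative:

  import uhd

  # The B210 enumerates over USB as a b200-family device.
  usrp = uhd.usrp.MultiUSRP("type=b200")

  # Channel 'A' TX/RX port feeds a CBRS element of the VVSSP-360S-F.
  usrp.set_rx_antenna("TX/RX", 0)

  # Capture 100k samples at 3.56 GHz (CBRS), 5 Msps, 50 dB gain.
  samps = usrp.recv_num_samps(100000, 3.56e9, 5e6, [0], 50)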
14.3 Fixed-endpoint resources
Each Fixed Endpoint (FE) installation in Powder contains an ensemble of software defined radio (SDR) equipment from National Instruments (NI) with complementary small form factor compute nodes. There is no wired backhaul, though the platform does provide seamless access via cellular/WiFi to resources in an FE installation. Endpoints are mounted at approximately human height on the sides of buildings.
- Antennas
- 1 x Taoglas GSA.8841 wideband I-bar antenna
698 - 6000 MHz frequency range
Approximately -2 dBi average gain across range
- NI SDR Equipment
Note: The current configuration is described below. This is expected to change in the coming months as RF front-end circuits are completed and deployed.
- 1 x NI USRP B210 SDR on nuc1
Channel ’A’ RX2 port connected to dedicated GSA.8841 antenna
Connected via USB 3.0 to NUC host (described below)
- 1 x NI USRP B210 SDR on nuc2
Channel ’A’ connected to a Band 7 FDD front-end (see the FDD tuning sketch at the end of this section)
RF front-end provides FDD, reduced noise figure, and 4W maximum (saturated) power
Connected via USB 3.0 to NUC host (described below)
- Intel NUC Compute
- 2 x Intel NUC8i7BEH small form factor PCs (nuc1 and nuc2), each with:
Intel Core i7-8559U
32 GB RAM (2 x 16GB Corsair Vengeance 2400 MHz DDR4 SODIMM)
250 GB NVMe storage (Kingston)
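Since the nuc2 B210 sits behind an FDD front-end, its TX and RX chains are tuned to the paired Band 7 sub-bands rather than a single TDD frequency. A minimal UHD Python sketch of one possible (eNodeB-side) assignment; the exact frequencies within the band are placeholders:

  import uhd

  usrp = uhd.usrp.MultiUSRP("type=b200")

  # LTE Band 7 FDD: uplink 2500 - 2570 MHz, downlink 2620 - 2690 MHz.
  usrp.set_tx_freq(uhd.types.TuneRequest(2.655e9), 0)  # transmit in the downlink
  usrp.set_rx_freq(uhd.types.TuneRequest(2.535e9), 0)  # receive in the uplink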
14.4 Mobile-endpoint resources
Each Mobile Endpoint (ME) installation in Powder contains two software defined radio (SDR) units from National Instruments (NI) and one small form factor compute node. As with fixed endpoints, there is no wired backhaul, though the platform does provide seamless access via cellular/WiFi to resources in an ME installation. Endpoints are mounted inside campus shuttle buses on a rear shelf, with antennas attached to a rear window.
- Antennas
- 1 x Taoglas GSA.8841 wideband I-bar antenna
698 - 6000 MHz frequency range
Approximately -2 dBi average gain across range
- NI SDR Equipment
- 1 x NI USRP N300 SDR on ed1
Note: The N300 SDR is currently unavailable.
Connected via 2 x 10 Gb Ethernet directly to the Supermicro host (described below; see the device-argument sketch at the end of this section)
- 1 x NI USRP B210 SDR on ed1
Channel ’A’ connected to ...
Connected via USB 3.0 to Supermicro host (described below)
- Supermicro Compute Node
- 1 x Supermicro Mini-ITX E300-8D SuperServer
Intel Xeon D-1518 SoC (4 cores, 2.2 GHz)
64 GB RAM (2 x 32GB Hynix 2667MHz DDR4 DIMMs)
480 GB Intel SSDSCKKB480G8 SATA SSD
2 x 10Gb Intel SoC Ethernet (both connected to N300 SDR)
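Both ME radios are standard UHD devices, just attached over different transports, so they are distinguished by UHD device arguments. A short sketch; the IP addresses are placeholders for whatever the ME network configuration assigns, and the N300 line is commented out since that unit is currently unavailable:

  import uhd

  # The B210 attaches over USB 3.0 and enumerates as a b200-family device.
  print(uhd.find("type=b200"))

  # The N300 would be reached over its two 10 Gb links (placeholder addresses):
  # usrp = uhd.usrp.MultiUSRP("addr=192.168.10.2,second_addr=192.168.20.2")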
14.5 Skylark IRIS Endpoints
Skylark IRIS SDRs are deployed as endpoints in two east-facing offices of the Merrill Engineering Building. These are connected via 1 Gbps copper links.
- Skylark SDR Equipment
- 1 x IRIS-030-D
- Includes IRIS-FE-03-CBRS front-end modules
BRS and CBRS capable: 2555 - 2655, 3550 - 3700 MHz
26 dBm, 2 x 2 TDD
14.6 Near-edge computing resources
The following resources are connected to the Powder base stations via 100 Gb links at an aggregation point in the Fort Douglas datacenter.
d740  | 16 nodes (Skylake, 24 cores)
CPU   | 2 x Xeon Gold 6126 processors (12 cores, 2.6 GHz)
RAM   | 96 GB memory (12 x 8 GB RDIMMs, 2666 MT/s)
Disks | 2 x 240 GB SATA 6 Gbps SSDs
NIC   | 10 GbE dual-port embedded NIC (Intel X710)
NIC   | 10 GbE dual-port converged NIC (Intel X710)

d840  | 3 nodes (Skylake, 64 cores)
CPU   | 4 x Xeon Gold 6130 processors (16 cores, 2.1 GHz)
RAM   | 768 GB memory (24 x 32 GB RDIMMs, 2666 MT/s)
Disks | 1 x 240 GB SATA 6 Gbps SSD
Disks | 4 x 1.6 TB NVMe SSDs
NIC   | 10 GbE dual-port embedded NIC (Intel X710)
NIC   | 40 GbE dual-port converged NIC (Intel XL710)
All nodes are connected to two networks:
- A 10 Gbps Ethernet “control network”. This network is used for remote access, experiment management, etc., and is connected to the public Internet. When you log in to nodes in your experiment using ssh, this is the network you are using. You should not use this network as part of the experiments you run in Powder.
- A 10/40 Gbps Ethernet “experimental network”. Each d740 node has two 10 Gb interfaces, one connected to each of two Dell S5248F-ON datacenter switches. Each d840 node has two 40 Gb interfaces, both connected to the switch that also hosts the corresponding mMIMO base-station connections.
The two S5248F-ON switches are connected to a third S5248F-ON "aggregation" switch via 2 x 100GbE links each. All three switches host 10Gb connections from the eight roof-top base-stations. The aggregation switch also hosts 100Gb uplinks to the MEB and DDC datacenters that contain further Powder resources and uplinks to Emulab and CloudLab.
14.7 Cloud computing resources
In addition to the resources above, Powder can allocate bare-metal computing resources on any one of several federated clusters, including CloudLab and Emulab.
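Targeting one of these clusters does not change the shape of a request: under the usual geni-lib conventions, a profile names the federated aggregate for a node and otherwise proceeds as before. A minimal sketch, assuming the standard component-manager URN for Emulab:

  import geni.portal as portal

  request = portal.context.makeRequestRSpec()

  # Ask a federated aggregate (Emulab here) for a bare-metal node.
  node = request.RawPC("node")
  node.component_manager_id = "urn:publicid:IDN+emulab.net+authority+cm"

  portal.context.printRequestRSpec()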