Time for my favorite talk of #39c3, the infrastructure review. It truly shows the scale and effort involved in putting the congress together: networking, communication, video recording / streaming, power coordination, angels and much more. It demonstrates the saying that "it takes a village" to put a community event together.
🧵
The power team laid 20655 cables spanning 22km and weighing 3 tons of copper. They brought 209 flight cases and deployed 90 PDUs, of which 25 were stagesmart models. They also tested the RCDs' trip times.
#39c3
Congress is powered by a 10kV ring through 2 transformers. There is a "Thomas the Tank Engine" generator for emergency power. #39c3
The power is actively monitored and the team gets alerts if the power goes out. Grafana monitors the stagesmart boxes, and readings can be mapped to the various assemblies since the team knows where everyone is. #39c3
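The outage alerting described above can be sketched as a simple threshold check over per-PDU load readings. This is a hypothetical illustration, not the team's Grafana setup; the PDU labels and threshold are made up:

```python
def check_outage(readings: dict[str, float], threshold_watts: float = 1.0) -> list[str]:
    """Return the PDUs whose reported draw fell below the threshold,
    i.e. candidates for a power-loss alert. `readings` maps a PDU
    label (an assembly location) to its last reported draw in watts."""
    return [pdu for pdu, watts in sorted(readings.items()) if watts < threshold_watts]

# Hypothetical example: one assembly's PDU stopped reporting any load.
alerts = check_outage({"halle-h-ceiling": 5400.0, "house-of-tea": 0.0})
```

In a real deployment this comparison would live in a Grafana alert rule over the stagesmart time series rather than in application code.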

The congress used roughly 6 MWh of power (for the monitored usage), and the following used the most power:
1) Halle H (ceiling lights)
2) YoloColo
3) House of Tea

#39c3

The team handed out a bunch of keys and deployed radios in a trunking topology (instead of roaming). 6 repeaters were deployed. #39c3
The team also repaired the building to get the accessibility stairs working again. This also includes the sinkhole that appeared in Saal 1. #39c3

c3mobelhaus deployed:
* 1298 standard tables
* 77 small tables
* 280 banquet tables
* 530 rentable tables
* 6741 chairs

All of this weighed 118 Tonnes. #39c3

Time for CCC Internetmanufaktur™:
All of these links are 100 Gbps. The backbone and distribution layers were built on Juniper hardware, with optics provided by Flex Optics. EVPN-VXLAN CRB provided building-wide L3. #39c3
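For context on what EVPN-VXLAN carries, the data-plane encapsulation is a fixed 8-byte VXLAN header (RFC 7348). A minimal sketch of building that header, with a made-up VNI (this is illustrative, not the team's tooling):

```python
import struct

def vxlan_header(vni: int) -> bytes:
    """Build the 8-byte VXLAN header per RFC 7348: a flags byte 0x08
    (VNI present), 3 reserved bytes, the 24-bit VNI, and one final
    reserved byte."""
    if not 0 <= vni < 2 ** 24:
        raise ValueError("VNI must fit in 24 bits")
    # Pack flags, skip 3 reserved bytes, then shift the VNI into the
    # top 24 bits of the final 32-bit word.
    return struct.pack("!B3xI", 0x08, vni << 8)

hdr = vxlan_header(1042)  # hypothetical VNI for one building segment
```

The EVPN control plane (BGP) decides which MAC/IP routes map to which VNI; the header above is all the overlay adds per packet besides the outer UDP/IP headers.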
Weather map of the traffic flowing through congress. The YoloColo saturated its 100 Gbps link. 300 Gbit of capacity was provided by several partners. #39c3
The team managed to deploy WiFi 6 across the building with a peak of 10277 clients. Planning was done in Hamina, which made the WiFi deployment better. Almost a third of capacity was over 6 GHz. #39c3
This is the spectrum analysis of the 2.4, 5 and 6 GHz bands. IPv6 was about 35% of the usage. The team is considering making the YoloColo IPv6-only. A Juniper SRX cluster was used as a NAT64 gateway. #39c3
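A NAT64 gateway works by embedding each IPv4 address in the low 32 bits of an IPv6 /96 prefix. A sketch using the RFC 6052 well-known prefix 64:ff9b::/96 (the prefix the SRX cluster actually used isn't stated in the talk):

```python
import ipaddress

def nat64_synthesize(ipv4: str, prefix: str = "64:ff9b::/96") -> ipaddress.IPv6Address:
    """Embed an IPv4 address in the low 32 bits of a NAT64 /96 prefix
    (RFC 6052), as a NAT64 gateway or DNS64 resolver would."""
    net = ipaddress.IPv6Network(prefix)
    v4 = ipaddress.IPv4Address(ipv4)
    # Indexing an IPv6Network adds the offset to the network address,
    # placing the 32-bit IPv4 value in the final two hextets.
    return net[int(v4)]

addr = nat64_synthesize("192.0.2.1")  # → 64:ff9b::c000:201
```

An IPv6-only YoloColo would then reach IPv4-only hosts entirely through such synthesized addresses, with DNS64 handing them out in AAAA responses.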
The team used the same automation stack as last year. A lot of equipment was prestaged, which made deployment faster. All of the patch panels were imported into NetBox. DoT was enabled on the resolvers, and CO2 monitoring was done with BLE beacons. #39c3
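Importing patch panels into NetBox comes down to POSTing device objects to its REST API. A hedged sketch that only builds the request payloads; the naming scheme and the device_type/role IDs are hypothetical, and a real import would resolve them via the API first:

```python
def patch_panel_payload(name: str, site_slug: str, rack_id: int) -> dict:
    """Build a NetBox-style device payload for one patch panel.
    The device_type and role IDs below are placeholders, not values
    from the congress deployment."""
    return {
        "name": name,
        "site": {"slug": site_slug},
        "rack": rack_id,
        "device_type": 42,  # hypothetical patch-panel type ID
        "role": 7,          # hypothetical "patch-panel" role ID
    }

# Hypothetical batch for one rack; each payload would be POSTed to
# the NetBox devices endpoint with an API token.
payloads = [patch_panel_payload(f"pp-saal1-{i:02d}", "congress", 1) for i in range(1, 4)]
```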
The team made the network available in some hotel rooms. 2.5 tonnes of equipment was provided by eventinfra. Peak WiFi traffic was 7.13 Gbps. Dante over EVPN is hard. The peak outbound AS was AS57575 and the peak inbound AS was AS12876. 359 patch requests came in via the NetBox forms. #39c3
ChaosPost - 7849 postcards were forwarded to Deutsche Post, of which 2145 were international and 5704 were national. #39c3
Over 1 ton of potatoes were used #39c3

Next up is the Phone Operations Center (POC). They deployed:
* ~ 11 300 configured extensions (8 500 last year), an increase of more than 32%
* 6 100 configured DECT phones
* 4 600 concurrent DECT phones
* Max 110 calls in parallel
* More than 400k calls were made (>206k DECT)

#39c3

The team also provided a blind dating service. There were 11 matches between users, and 3 connected matches said that they had a good time. #39c3

A problem this year was SIP war dialing. In previous years the team saw random war dialing, but this year it was a larger challenge. Rate limiting may be implemented in the future.

#39c3

They handed out 171 phones on loan to organizers. There is an eSIM provisioning service, and Spanish and Polish were added as additional languages. The following was deployed:
* 70 SIP Phones
* 75 DECT Antennas
* 7 EPDDI Antennas

#39c3

Next up is the C3 GSM team. The team deployed 8 base stations for 2G and 18 base stations for 4G.
Operational stats:
* 2331 SIM cards (902 physical and 1429 eSIMs)
* 1424 concurrently "active" devices
* 11967 calls
* 9007 SMSs
* 2 Open5GS crashes - a massive decrease from last year. #39c3

Next up is the Video Operations Center (VOC), or the vertical operations center 😂

Stats:
* 154h of content in 182 talks
* 11 Pallets of important stuff
* 28 cameras / 6 stages
* 12 TiB of raw audio + video material
* ~700 Submissions for infobeamer content. #39c3

The peak was 6060 concurrent viewers. 248 ch of backup recording was done; 17 ch were used for the orchestra. #39c3
The team made some changes to make transporting easier. This included making the angel introductory meeting on day 0 better. #39c3
7 seconds of audio was lost due to the Dante master clock being forcefully power cycled. Also some cables experienced some pain. #39c3

Next up is the C3 lingo team who does the translations of the event.

74 Angels translated German to English, and
28 Angels translated various other languages (French, etc). #39c3

The lingo team has built their own hardware for their specific requirements. 2 units were tested last year and 9 were deployed this year. A feat worthy of "Employee of the Month". #39c3
The team also synchronized data from pretalx. Speakers helped by uploading their materials beforehand so that the team could prepare for the translations. #39c3
@jarednaude
Ute✊
Why is the ground red? 🫣

@jarednaude They mentioned a more in-depth talk concerning their automation stack. Any clues where I could find that? Would be highly interested.

#39c3 #InfrastructureAsACode

@jarednaude hamina in finland?