50 million rendered polygons vs one spicy 4.2MB boi

https://lemmy.world/post/15387624


Everybody gangsta till we invent hardware-accelerated JSON parsing
ieeexplore.ieee.org/document/9912040, “Hardware Accelerator for JSON Parsing, Querying and Schema Validation”: “we can parse and query JSON data at 106 Gbps”

106 Gbps

They get to this result on 0.6 MB of data (paper, page 5)

They even say:

Moreover, there is no need to evaluate our design with datasets larger than the ones we have used; we achieve steady state performance with our datasets

This requires an explanation. I do see the need: if you promise 100 Gbps, you need to show it holds up on at least a few TBs of data.
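A quick back-of-envelope check on the two numbers from the paper (0.6 MB of data, 106 Gbps claimed): the whole benchmark is over in tens of microseconds, and sustaining that rate for even one second would take gigabytes of JSON.

```python
# Numbers taken from the paper as quoted above; everything else is arithmetic.
rate_bps = 106e9          # claimed throughput: 106 gigabits per second
dataset_bits = 0.6e6 * 8  # the ~0.6 MB benchmark dataset, in bits

# How long the benchmark actually runs at the claimed rate
seconds = dataset_bits / rate_bps
print(f"benchmark duration: {seconds * 1e6:.1f} microseconds")  # ~45 µs

# How much JSON you would need to sustain 106 Gbps for one second
gb_per_second = rate_bps / 8 / 1e9
print(f"data needed per second: {gb_per_second:.2f} GB")  # ~13 GB/s
```

So the 106 Gbps figure is extrapolated from a burst lasting about 45 microseconds; at that rate, a one-hour run would chew through roughly 47 TB of JSON.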

But to write such a file you'd need a few quantum computers map-reducing the data in alternative universes