It's not 10x DSD; it's multibit PDM, essentially pseudo-PCM at 24-bit/28MHz, i.e. 10x the DSD rate but NOT single bit.
It is then down-converted to DSD 2x, and a high-speed video switch is used for I/V conversion, with a transformer-based low-pass filter on the output. The FPGA doing the upsampling has tons of gates available for massive raw data crunching, which is what allows an optimized algorithm.
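To make the down-conversion step concrete, here is a minimal sketch of a first-order delta-sigma modulator that requantizes multibit samples to a single-bit stream, which is the basic idea behind turning multibit PDM into a DSD-style bitstream. The rates and the algorithm itself are simplified stand-ins for illustration, not the DAC's actual FPGA code:

```python
import math

def delta_sigma_1bit(samples):
    """Requantize multibit samples (floats in [-1, 1]) to +/-1 bits.
    The feedback loop pushes quantization error toward high frequencies,
    where the analog low-pass filter can remove it."""
    bits = []
    integrator = 0.0
    feedback = 0.0
    for x in samples:
        integrator += x - feedback   # accumulate the running error
        feedback = 1.0 if integrator >= 0.0 else -1.0
        bits.append(feedback)
    return bits

# A low-frequency test tone: its average survives the 1-bit requantization.
tone = [0.5 * math.sin(2 * math.pi * i / 500) for i in range(2000)]
bits = delta_sigma_1bit(tone)
print(sorted(set(bits)))                       # [-1.0, 1.0]
avg_err = abs(sum(bits) - sum(tone)) / len(bits)
print(avg_err < 0.01)                          # True
```

The point is that only two output levels remain, yet the low-frequency content is preserved; the quantization noise ends up above the audio band where the transformer/LPF stage filters it out.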
The big thing is the forced reclocking/buffer, which resets the inputs to the DAC's own internal pace and starts the jitter story from scratch. Then careful component layout avoids creating new jitter downstream.
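The reclocking idea can be sketched as a FIFO that bridges two clock domains: samples arrive on the jittery source clock, queue up, and are read out on a clean local clock, so output timing is set entirely by the local oscillator. Everything here (names, buffer depth, rates) is a hypothetical illustration, not the real hardware design:

```python
from collections import deque

class ReclockingBuffer:
    def __init__(self, depth=64):
        self.fifo = deque(maxlen=depth)

    def write(self, sample):
        """Driven by the incoming (jittery) clock domain."""
        self.fifo.append(sample)

    def read(self):
        """Driven by the clean local clock domain."""
        return self.fifo.popleft() if self.fifo else 0.0  # underrun -> silence

buf = ReclockingBuffer()

# Half-fill the buffer first: this latency is the slack that absorbs
# short-term rate differences between the two clock domains.
for n in range(32):
    buf.write(n)

# Steady state: one write per source tick, one read per local tick.
# The source ticks can wander (jitter) without affecting the read side,
# as long as the buffer neither empties nor overflows.
out = []
for n in range(32, 100):
    buf.write(n)
    out.append(buf.read())

print(out[:5])   # [0, 1, 2, 3, 4]
```

The data comes out intact and in order, but on the local clock's schedule, which is why any jitter accumulated upstream stops mattering at this boundary.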
This is more a story of a potentially great implementation of a simple, elegant starting theory than of raw innovation.