Wednesday, August 22, 2018

Why do we still transmit data via binary?

I understand the use of binary at the hardware level. Gates are considerably less complicated when they only have to distinguish on from off. It also makes sense from an error-correction standpoint.

Network data, however... its history is one of converting analog signals to digital. It seems like, with the sophistication we have today, bandwidth could be increased dramatically by going back to a non-binary method.

Bear with me. Why not use different bands of the light spectrum to represent any given byte? Instead of passing 8 bits, you could do the same with a single, distinguishable color. It doesn't seem like a stretch to define 256 easily demarcated bands of light, considering how vast the overall spectrum is.
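For what it's worth, the scheme I'm describing can be sketched in a few lines. This is just an illustration of the mapping, not a real modulation spec: the 400–700 nm range and the evenly spaced bands are assumptions I'm making up for the example.

```python
# Sketch of the idea: map each byte value (0-255) to one of 256 evenly
# spaced wavelength bands. The 400-700 nm visible range and the uniform
# band width are illustrative assumptions only.

LOW_NM, HIGH_NM = 400.0, 700.0           # assumed usable spectrum
BAND_WIDTH = (HIGH_NM - LOW_NM) / 256    # ~1.17 nm per band

def byte_to_wavelength(value: int) -> float:
    """Center wavelength (nm) of the band assigned to a byte value."""
    if not 0 <= value <= 255:
        raise ValueError("byte value out of range")
    return LOW_NM + (value + 0.5) * BAND_WIDTH

def wavelength_to_byte(nm: float) -> int:
    """Decode a received wavelength back to the nearest byte value."""
    band = int((nm - LOW_NM) / BAND_WIDTH)
    return min(max(band, 0), 255)

# Every byte value should round-trip through encode/decode.
assert all(wavelength_to_byte(byte_to_wavelength(b)) == b for b in range(256))
```

Note how narrow the bands get: ~1.17 nm each, which hints at why "easily demarcated" is doing a lot of work in my argument.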

Even if you had to send 3 passes for error correction, bandwidth would still increase by over 160% (8 bits become 3 symbols), and in practice more, because you could short-circuit the 3rd pass whenever the first 2 matched.
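The arithmetic behind that estimate is easy to check (it actually comes out to about 167%). This assumes one 256-level color symbol takes the same wire time as one binary bit, which is the generous premise of the whole idea:

```python
# Throughput comparison, assuming one 256-level color symbol costs the
# same transmission time as one binary bit (this post's implicit premise).

BITS_PER_BYTE = 8
PASSES = 3  # repeat each color symbol 3 times for error checking

symbols_per_byte = PASSES               # 3 color symbols per byte
binary_units_per_byte = BITS_PER_BYTE   # vs. 8 bits per byte

speedup = binary_units_per_byte / symbols_per_byte  # 8/3 ~ 2.67x
increase_pct = (speedup - 1) * 100                  # ~167%

print(f"{speedup:.2f}x throughput, a {increase_pct:.0f}% increase")
```

Short-circuiting the 3rd pass when the first 2 agree would push the average even higher, toward the 4x of a plain 2-pass scheme.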

What am I missing here? Is it an issue of degradation?

edit: Nevermind. It's 2am, and I'm not that bright. Packets, not 1s and 0s.


