Twelve atoms are all it takes to store a bit of data, a 1 or a 0, according to a new discovery that probes the lower limit of classical data storage.
Computer hard drives on the market today use more than a million atoms to store a single bit, or roughly eight million for a byte, the eight-bit unit needed to encode a single character such as the letter A.
The new technique uses just 96 atoms per byte, which could allow hard drives to store 100 times more information in the same physical space, according to the researchers behind the discovery.
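For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python; the constants and names are illustrative, taken only from the rough figures quoted above:

```python
# Back-of-the-envelope check of the atom counts quoted in the article.
# All figures are the article's rough numbers, not measured values.

ATOMS_PER_BIT_TODAY = 1_000_000  # conventional hard drive, order of magnitude
ATOMS_PER_BIT_NEW = 12           # the newly reported technique
BITS_PER_BYTE = 8

atoms_per_byte_today = ATOMS_PER_BIT_TODAY * BITS_PER_BYTE  # ~8,000,000
atoms_per_byte_new = ATOMS_PER_BIT_NEW * BITS_PER_BYTE      # 96, as reported

print(f"Atoms per byte today: ~{atoms_per_byte_today:,}")
print(f"Atoms per byte, new technique: {atoms_per_byte_new}")
print(f"Raw atom-count ratio: ~{atoms_per_byte_today / atoms_per_byte_new:,.0f}x")
```

Note that the raw atom-count ratio, roughly 83,000 to one, is far larger than the researchers' 100-fold projection for practical drives, which presumably reflects the engineering overhead of actually reading and writing such tiny bits.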