Data compression is the process of encoding data using fewer bits than the original representation, so it needs less space to store or less bandwidth to transmit. As a result, compressed data occupies less disk space than the original, and more content can be kept in the same amount of storage. There are various compression algorithms that work in different ways. With many of them, only redundant bits are removed, so when the data is uncompressed there is no loss of quality; this is known as lossless compression. Others (lossy algorithms) discard bits considered less important, so uncompressing the data afterwards yields lower quality than the original. Compressing and uncompressing content requires significant system resources, especially CPU time, so any web hosting platform that applies compression in real time must have enough processing power to support this feature. A simple example of how data can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the entire sequence.
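The 6x1 example above is a form of run-length encoding. A minimal sketch of the idea (the function names and the comma-separated output format are illustrative choices, not part of any particular standard) could look like this:

```python
def rle_encode(bits: str) -> str:
    """Compress a bit string by storing run lengths, e.g. '111111' -> '6x1'."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        # Advance j to the end of the current run of identical bits.
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append(f"{j - i}x{bits[i]}")
        i = j
    return ",".join(runs)

def rle_decode(encoded: str) -> str:
    """Expand run-length pairs back into the original bit string."""
    return "".join(int(count) * bit
                   for count, bit in (run.split("x") for run in encoded.split(",")))

print(rle_encode("111111"))      # 6x1
print(rle_encode("0001111100"))  # 3x0,5x1,2x0
```

Note that this only pays off when the data actually contains long runs; on data with no repetition, storing the run counts can make the output larger than the input, which is why practical algorithms such as LZ4 use more sophisticated schemes.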
Data Compression in Shared Website Hosting
The compression algorithm that we use on the cloud web hosting platform where your new shared website hosting account will be created is called LZ4, and it is applied by the advanced ZFS file system that powers the platform. LZ4 outperforms the algorithms used by other file systems because it achieves a considerably higher compression ratio while processing data much faster. Its speed is most noticeable during decompression, which can happen faster than data can be read from a hard disk drive. Because of this, LZ4 improves the performance of any website hosted on a server that uses the algorithm. We take advantage of LZ4 in another way as well: its speed and compression ratio allow us to generate several daily backups of the entire content of all accounts and keep them for 30 days. Not only do these backups take up less space, but their generation does not slow the servers down, as often happens with other file systems.
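For readers curious how this works at the file-system level, compression in ZFS is a per-dataset property set by the administrator. A sketch of the typical commands follows; the pool and dataset names (`tank/webhosting`) are hypothetical, and this is an illustration of standard ZFS administration, not a description of our platform's internal configuration:

```shell
# Enable LZ4 compression on a ZFS dataset (applies to newly written data).
zfs set compression=lz4 tank/webhosting

# Verify the property and inspect the compression ratio achieved so far.
zfs get compression tank/webhosting
zfs get compressratio tank/webhosting
```

Because ZFS compresses and decompresses blocks transparently, applications and websites reading from the dataset need no changes to benefit from it.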