The term data compression refers to reducing the number of bits of information that needs to be stored or transmitted. Compression can be lossless or lossy: lossless compression removes only redundant data, so when the data is later decompressed, both the content and the quality are identical to the original, while lossy compression also discards data deemed unnecessary, so some quality is lost. Different compression algorithms are more effective for different types of content. Compressing and decompressing data usually takes considerable processing time, so the server performing the operation must have sufficient resources to process the information quickly enough. A simple example of how information can be compressed is to store how many consecutive positions in a binary sequence contain a 1 and how many contain a 0, instead of storing every individual 1 and 0.
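The simple scheme described above is known as run-length encoding. A minimal sketch in Python (function names are illustrative, not from any particular library) shows both the encoding and the lossless round trip:

```python
def rle_encode(bits):
    """Run-length encode a string of '0'/'1' characters as (bit, count) pairs."""
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((bit, 1))              # start a new run
    return runs

def rle_decode(runs):
    """Rebuild the original bit string from (bit, count) pairs."""
    return "".join(bit * count for bit, count in runs)

data = "1111100000000111"
encoded = rle_encode(data)
print(encoded)                       # [('1', 5), ('0', 8), ('1', 3)]
assert rle_decode(encoded) == data   # lossless: the round trip is exact
```

Three pairs replace sixteen symbols here; the longer the runs, the bigger the saving, which is why this kind of scheme works well on highly repetitive data.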

Data Compression in Shared Hosting

The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. It is significantly faster than most alternatives, particularly for compressing and decompressing non-binary data such as web content. LZ4 can even decompress data faster than the data can be read from a hard disk drive, which improves the overall performance of sites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several backups of all the content kept in the shared hosting accounts on our servers every day. Both your content and its backups require less space, and since both ZFS and LZ4 work extremely fast, the backup generation does not affect the performance of the servers where your content is kept.
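For readers curious how LZ4 is enabled at the file-system level, ZFS exposes it as a per-dataset property. A sketch with a hypothetical pool/dataset name (`tank/web` is an assumption, not our actual layout):

```shell
# Enable LZ4 compression on a ZFS dataset (dataset name is hypothetical)
zfs set compression=lz4 tank/web

# Inspect the setting and the achieved compression ratio for stored data
zfs get compression,compressratio tank/web
```

The `compressratio` property reports how much smaller the data is on disk than its logical size, which is where the space savings for content and backups come from.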

Data Compression in Semi-dedicated Servers

The ZFS file system that runs on the cloud platform where your semi-dedicated server account will be created uses a powerful compression algorithm called LZ4. It is among the best algorithms available and particularly efficient at compressing and decompressing website content, as its compression ratio is high and it can decompress data faster than the same data could be read from a hard disk drive if it were stored uncompressed. As a result, LZ4 speeds up every site that runs on a platform where the algorithm is enabled. This level of performance requires a lot of CPU processing time, which is provided by the large number of clusters working together within our platform. In addition, LZ4 allows us to generate several backup copies of your content every day and keep them for one month, as they take up much less space than standard backups and are created considerably faster without putting load on the servers.
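The reason website content compresses so well is that HTML, CSS, and similar text are highly repetitive. LZ4 itself is not in the Python standard library, so the sketch below uses `zlib` purely as a stand-in lossless compressor to illustrate the same idea (the sample markup is invented for the demonstration):

```python
import zlib

# Repetitive web content, typical of generated HTML listings.
html = b"<li class='item'>entry</li>\n" * 1000

compressed = zlib.compress(html)
ratio = len(html) / len(compressed)

print(f"{len(html)} bytes -> {len(compressed)} bytes (~{ratio:.0f}:1)")
assert zlib.decompress(compressed) == html   # lossless round trip
assert len(compressed) < len(html)           # backups of such data need far less space
```

The exact ratio differs between zlib and LZ4 (LZ4 trades some ratio for much higher speed), but the principle is the same: lossless compression shrinks repetitive content substantially while reconstructing it exactly.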