The term data compression refers to reducing the number of bits of information that needs to be saved or transmitted. This can be done with or without loss of information: lossless compression removes only redundant data, while lossy compression discards data that is deemed unnecessary. When the data is decompressed later, in the first case it will be identical to the original, whereas in the second case its quality will be lower. Different compression algorithms are more efficient for different types of data. Compressing and decompressing data often takes a lot of processing time, so the server carrying out the operation must have sufficient resources to process your data quickly enough. A simple example of how information can be compressed is to store how many consecutive positions in the binary code should contain 1 and how many should contain 0, rather than storing the actual 1s and 0s.
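As a minimal sketch of that run-length idea (the function names are just for this illustration), the following Python snippet encodes a bit string as (bit, count) pairs and shows that decoding restores the original data exactly:

```python
# Minimal sketch of the run-length idea described above: instead of storing
# every 1 and 0, store how many identical bits appear in a row.
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Turn a string of 1s and 0s into (bit, run_length) pairs."""
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)   # extend the current run
        else:
            runs.append((bit, 1))               # start a new run
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Rebuild the original bit string from the (bit, run_length) pairs."""
    return "".join(bit * count for bit, count in runs)

bits = "1111100000000110"
runs = rle_encode(bits)
print(runs)                          # [('1', 5), ('0', 8), ('1', 2), ('0', 1)]
assert rle_decode(runs) == bits      # lossless: the original data is restored
```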

Data Compression in Cloud Web Hosting

The ZFS file system that runs on our cloud hosting platform employs a compression algorithm named LZ4. The algorithm offers an excellent balance of speed and compression ratio and is considerably faster than most alternatives, particularly for compressing and decompressing non-binary data such as web content. LZ4 even decompresses data faster than it can be read from a hard disk, which improves the performance of websites hosted on our ZFS-based platform. Since the algorithm compresses data efficiently and does so very quickly, we are able to generate several daily backup copies of all the content stored in the cloud web hosting accounts on our servers. Both your content and its backups take up less space, and since both ZFS and LZ4 work extremely fast, generating the backups does not affect the performance of the web hosting servers where your content is stored.
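To give a feel for how well LZ4 handles repetitive web content, here is a small sketch using the third-party python-lz4 package; the sample HTML string and whatever ratio it prints are illustrative assumptions, not figures measured on the hosting platform itself:

```python
# Illustrative only: compress a repetitive chunk of "web content" with LZ4
# using the third-party python-lz4 package (pip install lz4). The sample
# string and the resulting ratio are assumptions for this sketch.
import lz4.frame

html = ("<div class='post'><h2>Title</h2><p>Lorem ipsum dolor sit amet.</p></div>\n" * 500).encode("utf-8")

compressed = lz4.frame.compress(html)
restored = lz4.frame.decompress(compressed)

assert restored == html                       # lossless round trip
print(f"original:   {len(html)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"ratio:      {len(html) / len(compressed):.1f}x")
```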

Data Compression in Semi-dedicated Hosting

The semi-dedicated hosting plans that we offer are built on a powerful cloud platform that runs on the ZFS file system. ZFS uses a compression algorithm called LZ4, which offers an outstanding combination of speed and compression ratio when processing website content. This is especially true for decompression: LZ4 decompresses data faster than uncompressed data can be read from a hard drive, so websites running on a platform with LZ4 enabled load more quickly. We can take full advantage of the feature even though compression requires a fair amount of CPU time, because our platform uses numerous powerful servers working together rather than hosting accounts on a single machine, as many companies do. There is an additional benefit to using LZ4: since it compresses data efficiently and very quickly, we can also generate multiple daily backup copies of all accounts without affecting server performance and keep them for a month. That way, you can always recover any content that you delete by mistake.
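As a rough, back-of-the-envelope sketch of why compression matters at this retention window, consider the space needed to keep a month of daily backups for a single account; the account size and compression ratio below are assumptions chosen purely for illustration, not actual platform figures:

```python
# Back-of-the-envelope sketch: space needed to keep 30 daily backups of one
# account, with and without compression. The account size and compression
# ratio are illustrative assumptions, not real platform figures.
account_size_gb = 2.0        # assumed size of one hosting account
compression_ratio = 2.5      # assumed average LZ4 ratio for web content
retention_days = 30          # backups are kept for a month

raw_total = account_size_gb * retention_days
compressed_total = raw_total / compression_ratio

print(f"uncompressed backups: {raw_total:.0f} GB")        # 60 GB
print(f"compressed backups:   {compressed_total:.0f} GB")  # 24 GB
```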