Hi, I'm Christoph, the developer of compress-or-die.com.
Since I try to squeeze the most out of every image, working without a visual control mechanism makes little sense to me: every image needs different parameters to achieve the best result.
So, no, at the moment there is no batch tool available.
In most cases I invest a lot of CPU power to get the most out of the images. When many users compress images at the same time, that means a high CPU load on the servers, and with large images the load grows steeply. Powerful servers cost money.
At the moment, the users who support me via Patreon roughly cover the server costs. The 30-40 hours I otherwise put into the project come out of my free time. So it's purely a money problem that keeps me from doing it.
As far as I know, the problem only exists with Adobe Photoshop, which is why the expert mode of compress-or-die has a setting to make the file output Photoshop-compatible. The file then becomes slightly larger. So if you want to be on the safe side, activate this setting. However, I have never heard of the problem occurring outside of Photoshop.
Photoshop actually uses its own quantization tables and its own scaling values for the different quality settings. When I mention values, they always refer to the standard quantization tables, which many other programs also use, as well as the scaling function suggested by the JPEG specification.
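To illustrate what such a scaling function looks like, here is a small Python sketch based on the widely used approach from the IJG libjpeg implementation (its `jpeg_quality_scaling` function), together with the standard luminance quantization table from Annex K of the JPEG specification. This is my own illustrative sketch of the common convention, not the exact code used by compress-or-die or Photoshop:

```python
def jpeg_quality_scaling(quality: int) -> int:
    """Map a quality setting (1-100) to a scaling factor in percent,
    following the convention used by the IJG libjpeg implementation."""
    quality = max(1, min(100, quality))
    if quality < 50:
        return 5000 // quality          # low qualities scale the table up
    return 200 - quality * 2            # high qualities scale it down

def scale_quant_table(base_table: list[int], quality: int) -> list[int]:
    """Scale a base quantization table; entries are clamped to 1..255
    as required for baseline JPEG."""
    scale = jpeg_quality_scaling(quality)
    return [max(1, min(255, (v * scale + 50) // 100)) for v in base_table]

# Standard luminance quantization table from Annex K of the JPEG
# specification (ITU-T T.81), in row-major order.
STD_LUMINANCE = [
    16, 11, 10, 16,  24,  40,  51,  61,
    12, 12, 14, 19,  26,  58,  60,  55,
    14, 13, 16, 24,  40,  57,  69,  56,
    14, 17, 22, 29,  51,  87,  80,  62,
    18, 22, 37, 56,  68, 109, 103,  77,
    24, 35, 55, 64,  81, 104, 113,  92,
    49, 64, 78, 87, 103, 121, 120, 101,
    72, 92, 95, 98, 112, 100, 103,  99,
]
```

At quality 50 the scaling factor is 100%, so the standard table is used unchanged; at quality 100 every entry clamps to 1, meaning almost no quantization. Photoshop departs from exactly this convention, which is why its quality numbers are not comparable to those of other tools.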
Below is a table showing how the values relate to each other (determined with Photoshop 2023, version 24.1). The first value in the "Standard Quality" table corresponds to the luminance value and the second to the chrominance value.
Caution: Photoshop's quantization tables change slightly from time to time, so there may be minor variations.