omnscient0
Admin posted:
I am skeptical of how much a free CDN like Cloudflare would help. The biggest issue is total bandwidth, and it is spread across a large number of different images, so it wouldn't cache well. An actual CDN would be amazing, but would cost a non-trivial amount.
Is there really much size to be saved by losslessly optimizing the current images? They should already be well optimized by the time the scanlation group releases them. There are probably a few outliers that weren't exported properly, but I'm not sure it's worth re-processing the entire collection for those outliers. The bigger issue, I feel, is releases in a much higher quality than needed for web viewing (e.g. 2400 px tall), which I have noticed at times.
There are almost no GIF and TIF images on the site currently.
HTTPS is something I have been meaning to do for a while, as is a pass to fix bugs on mobile.
Gzip compression has always been enabled and appears to be working fine for me.
As a scanlation group editor (for multiple groups), I can tell you that image optimization varies wildly from group to group. The programs used differ, and each group's settings for those programs differ too, so results are all over the place. On average you would save 4-10% across the board on most manga PNGs with optipng (or, even better, trimage and the libraries it wraps). I can also tell you the JPEGs across the website are mostly unoptimized and use the basic Huffman tables instead of optimized ones.
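To make that concrete, here is a minimal sketch of the kind of lossless batch pass I mean, in Python. It assumes optipng and jpegtran are installed and on PATH; the releases/ directory and the -o2 level are just illustrative, and -copy none drops metadata, which is usually fine for manga pages.

```python
import subprocess
from pathlib import Path

def optimize_tree(root: str) -> None:
    for path in Path(root).rglob("*"):
        suffix = path.suffix.lower()
        if suffix == ".png":
            # Lossless PNG recompression; -o2 trades CPU time for ratio.
            subprocess.run(["optipng", "-o2", "-quiet", str(path)], check=False)
        elif suffix in (".jpg", ".jpeg"):
            # Rebuild the JPEG with optimized Huffman tables; the pixel
            # data is untouched. Write to a temp file, then swap it in.
            tmp = path.parent / (path.name + ".tmp")
            subprocess.run(
                ["jpegtran", "-optimize", "-copy", "none",
                 "-outfile", str(tmp), str(path)],
                check=False,
            )
            if tmp.exists() and tmp.stat().st_size > 0:
                tmp.replace(path)

optimize_tree("releases/")  # hypothetical root of the image archive
```

Because both tools are lossless, the pass is safe to re-run; already-optimized files just come out unchanged.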
A basic free CDN is preferable to pulling a large number of static files from the server every single time. Even if it doesn't speed the site up, it will save lots of bandwidth, which will save Gh0st some money. You could try it on a beta site rather than the main one and see if it improves anything?
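For the CDN to actually hold on to the images, the origin also has to send cache headers saying it may keep them. A minimal sketch, assuming the site could hang this off a Flask app (the framework and the mimetype check are my assumptions, not how the site is actually built):

```python
from flask import Flask

app = Flask(__name__)

@app.after_request
def make_images_cacheable(response):
    # Long-lived, publicly cacheable responses let an edge cache like
    # Cloudflare serve repeat reads without hitting the origin at all.
    if response.mimetype in ("image/png", "image/jpeg"):
        response.headers["Cache-Control"] = "public, max-age=31536000, immutable"
    return response
```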
Anyway, back to image optimization. It's also worth noting that the older the images are, the more likely they are to be unoptimized, simply because older tools compressed worse. So maybe spot-check the newer stuff, but do a full pass over the older assets we have?
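One hypothetical way to split that work: use file modification time as a rough proxy for release age and queue the oldest files first. The three-year cutoff is arbitrary, and mtimes may have been reset by server moves, so treat this as a sketch:

```python
import time
from pathlib import Path

CUTOFF_YEARS = 3  # arbitrary threshold, not a site policy

def stale_images(root: str):
    """Yield PNGs whose mtime predates the cutoff, oldest first."""
    cutoff = time.time() - CUTOFF_YEARS * 365 * 24 * 3600
    old = [p for p in Path(root).rglob("*.png")
           if p.stat().st_mtime < cutoff]
    yield from sorted(old, key=lambda p: p.stat().st_mtime)
```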
You're right about the resolution. Some people use 1024 x 768 screens while others view the website on 4K monitors and TVs. There should probably be a set standard resolution for web viewing, and people who want the gigantic originals can just be linked to the releasers' zips and such?
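As a sketch of what enforcing a standard could look like, here is a Pillow-based downscale; the 1600 px ceiling is just an example number, not a proposal for the actual standard:

```python
from PIL import Image

MAX_HEIGHT = 1600  # assumed web-viewing ceiling, purely illustrative

def downscale_for_web(src: str, dst: str) -> None:
    with Image.open(src) as im:
        if im.height > MAX_HEIGHT:
            new_w = round(im.width * MAX_HEIGHT / im.height)
            # Lanczos keeps line art and screentones reasonably crisp.
            im = im.resize((new_w, MAX_HEIGHT), Image.LANCZOS)
        im.save(dst)
```

The releasers' original zips would stay untouched; only the copies served on the web get capped.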