I have recently posted a guide on using batch tools to optimize PNG files. It would be great if any pngout gurus could cast a critical eye over it.
http://portableapps.com/node/648
Regards,
Andrew
Awesoken at
Nice guide, Andrew! It's nice to have all the tricks covered in one place. While reading the PNGOUT section, I noticed a few errata:
/d# : Bit depth: 0(minimize),1,2,4,8. This is only relevant to paletted images.
/d# is also relevant to /c0. If it doesn't seem to work, it's probably because my code expects the gray levels to fill the full intensity range (0-255) in this manner:
/d1: 0,255
/d2: 0,85,170,255
/d4: 0,17,34,...,238,255
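(In general, for /d#, gray level k must equal k*255/(2^d-1), for k = 0 to 2^d-1.)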
for /l %%j in (0,5,5) do pngout.exe /q /b256 /f%%j /d0 %1
Wouldn't that run the mixed filter twice? Looks like a typo.
if the image contains more than 64 colours (maximum colours for 4bpp)
Only 16 colors fit in 4bpp, not 64.
ace_dent at
Thanks!
Thanks for taking the time to read over it, Awesoken. I'm sure you're aware that your software has really made a difference to many communities: Wikipedia, mobile phone gaming, Portable Apps, to name but a few... so a big THANK YOU! :-)
I will apply your corrections, but I think the for statement is valid (start, step, end)- it should just test filters 0 & 5... I will check.
Best wishes,
Andrew
counting_pine at
PNG optimization guide
Nice article :)
for /l takes its arguments in (start,step,end) format, so what you put is fine.
But, since it's only trying two values, I'd just write for %%j in (0,5) do ...
You could probably use "for" to make your batch file smaller in other ways too, e.g.
for %%j in (32k,16k,8k,4k,2k,1k,512) do optipng.exe -q -nb -nc -zw%%j -zc1-9 -zm1-9 -zs0-3 -f0-5 %1
If you're being so thorough with the PNGOUT runs, I suggest you also run AdvDef four times to try each compression level, e.g.
for /l %%j in (1,1,4) do advdef.exe -z%%j %1
(you can write -z4 instead of -z -4)
By the way, I think "/b-0" behaves the same way as "/b0", so I'd suggest "/b-256" instead. I suspect PNGOUT converts it to a number before checking the sign.
Awesoken at
Whoops, ignore what I said about the "for %%j" stuff. I incorrectly guessed that the format was a list from looking at your other examples.
Counting_pine is right about /b-0. If you want a fast run, you'll need it to be a true negative number (< 0). /b-256 is a good choice.
ace_dent at
counting_pine, I really appreciate the help. I'm pretty rubbish at Windows batch files- most of the stuff I kinda just hack together from other people's scripts! :)
I will update the document and files with your advice. (I'd better add an acknowledgements section too!) Not sure what kind of contribution running AdvDef 4x times will make... do you have experience of any benefits? I will test this out. I will also test pngout '/b-0' vs '/b-256', as '/b-0' seemed to work for me...
Many thanks.
Andrew
PS- counting_pine, I'm always on the lookout to improve my scripts. Have you come up with a more efficient way to brute-force search with pngout? i.e. colour type > filter > block size. While I know this might not achieve the minimum file size, it would offer a better speed/compression trade-off.
ace_dent at
Ok, cheers Awesoken.
You posted while I was still typing :wink:
Andrew
counting_pine at
ace_dent said
Not sure what kind of contribution running AdvDef 4x times will make... do you have experience of any benefits?
Generally, the higher the compression level, the smaller the resulting file size (have you ever seen optipng -zc1 beat optipng -zc9?), but not always. Here's an example that shows that AdvDef -z1 can beat AdvDef -z4. It's perhaps a pretty extreme example, but nonetheless, it proves that it can happen.
ace_dent said
Have you come up with a more efficient way to brute search with pngout? ie. Colour type> filter> block size. While I know this might not achive the minimum file size, it would offer improved speed/ compression ratio.
I'd suggest finding out which combination of filter/colour type/bit depth produces the best results, before doing extensive tests with the block split threshold. You can work out which is best by comparing the file size after each trial to see if it goes down.
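For example, a minimal sketch of that search order (untested; it leans on the fact that pngout, without /force, only overwrites the file when the result is smaller, and /y answers its overwrite prompt):
rem Sweep colour types and filters first; pngout keeps whichever output is smallest.
for %%c in (0,2,3,6) do for /l %%f in (0,1,5) do pngout.exe /q /y /c%%c /f%%f %1
rem Only then sweep the block split threshold with the winning settings in place.
for %%b in (128,192,256,512) do pngout.exe /q /y /b%%b %1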
ace_dent at
Thanks again. I've now updated the guide and scripts with your valuable suggestions.
Regards,
Andrew
markcramer at
I played around a while ago with my own extensive batch script, and the filesize isn't a smooth function of the block split parameter. The filesize generally rises from b=1 to some peak at low(ish) b then drops again until you hit a value of b above which the file size doesn't change. At that point the filesize will be the same as b=0.
But between the peak and the final plateau, the filesize can vary considerably, sometimes staying higher than at b=0, sometimes dipping lower, but it ain't always a smooth curve, so writing a script to find a minimum seems almost impossible. I remember playing with one image where the filesize at one b value was 5% smaller than at b±1.
Particularly with paletted images, where various utilities order the palette differently, you can get marked differences in filesize.
counting_pine at
I can think of a way to make things more efficient, although it'll need more than a batch file for a front end.
I don't know what formulas PNGOUT uses, but basically it seems to calculate the set of block split points that have a calculated value higher than the threshold. As you lower the threshold, more points may be added, but no points will be removed.
Anyway, the upshot of this is that:
- Two different values of /b# may well produce the same set of block split points and will compress the same way.
- Any value in between them will also compress in that way.
- Also, if one value produces one more block split point than another value, then any value in between them will compress in the same way as one of those two values.
So, it's possible to save time by only trying to compress for different numbers of block split points.
A quicker way of checking the number of block split points for a value n is to run pngout /v /b-n.
So you could try a range of different values for /b# to work out how many block split points they create, discarding duplicate trials and maybe checking intermediate values. Then you could try those values with /b+n to see how well they compress.
ace_dent at
Backup copy...
Hi Guys.
It looks like PortableApp's server managed to trash my guide... and I failed to make a copy of the text. By any chance, did anyone make a copy?...
Cheers,
deus-ex at
Re: Backup copy...
ace_dent said
Hi Guys.
It looks like PortableApp's server managed to trash my guide... and I failed to make a copy of the text. By any chance, did anyone make a copy?...
Cheers,
I was about to give you a hint on how to retrieve the text from the web, but it has just come back online: http://portableapps.com/node/648
Edit: Now I see it's truncated from a certain point in section 1. The Google cache version I was going to point you to is truncated as well. Bad luck.
Zardalu at
PNG optimization guide
Maybe something to add to your guide: After it seems that PNGOUT has squeezed out every byte it could, you can run DeflOpt 2.00 (http://www.walbeehm.com/download/) to analyse the compressed data created by PNGOUT and usually still gain some more bytes. But please be sure to understand that PNGOUT really did all the hard work. DeflOpt can NOT compress a file on its own. It merely looks at things created by compressors like PNGOUT, KZIP, 7-zip, gzip, BJWFlate, WinZip, OptiPNG, and so on, and sees if it can gain a few more bytes by storing and/or using the compression trees differently.
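A final pass can then be as simple as (hypothetical file name; DeflOpt just takes the file to optimise as its argument):
deflopt.exe file.png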
Ben Jos.
UncleBuck at
pngu article
I have the article saved in .htm format, I must have saved it before the server crashed.
ace_dent at
Please say you're not kidding :)
Wow, that's great!!
Please could you send me a copy of it in a zip archive?
You can email me at~ com hotmail,
ace_dent.
(Hopefully you can assemble the email address, but the spam-bots can't)
Many thanks,
Andrew
ace_dent at
Updated version of PNGu
I'm still in the process of polishing the guide I originally wrote (rescued by 'UncleBuck'). In the meantime, I've updated my 'PNGu' scripts and the included programs here:
http://people.bath.ac.uk/ea2aced/OOo/PNGu.zip
Thundik81 at
PNG optimization guide
Thanks again ace_dent.
I'm not very good with batch writing, but I included one more feature: the number of bytes saved.
[...]title PNGuX %numpng% of %totalfiles%
echo Optimizing: %1 (%~z1b)
set z0=%~z1
[...]
echo %1 Optimized!
set /a zs=%z0%-%~z1
echo %zs%b saved!
echo.
[...]
It would also be nice to back up files before optimizing (often PNGOutWin+DeflOpt do better).
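That backup could be a one-liner at the top of the script (a sketch, using a hypothetical .bak suffix):
copy /y %1 "%~1.bak" >nul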
Awesoken at
- From Ken Silverman: use '/b-256' for speed [A1].
Using the /b option with negative values is deprecated. That was just a hack before I added a strategy option. The equivalent strategy option would be /s3 (Huffman-only), and you can make your /b values always positive. BTW, you can make PNGOUT run even faster using /s4 (uncompressed).
I tried your slower script on unionjack.png. After 25 minutes, it succeeded in converting it from 3011 bytes to 3079 bytes : / The problem is in your first step, when you run PNGOUT with /force. You should not assume that your script will always beat the original file size.
Also, you are wasting a lot of time on /b# values that are resulting in the same number of blocks. Have you played around with the /n# option? It's similar to /b#, except you specify the number of blocks directly. /n# is great for small files, but useless for large files that may have a lot of blocks. If you're going to use /b#, you might as well use /r in addition, so any duplicate trials are at least starting with different tables : )
ace_dent at
Thanks for these excellent suggestions- I will investigate them as time allows. I'm quite aware that the scripts aren't very clever and could search far more optimally. Also, as you point out, it assumes the input PNG is unoptimized. This is fine for my purpose... other users should obviously alter the scripts to their needs. In truth I now mainly use PNGOutWin: it's great (thanks Ken & David)! :D
counting_pine at
One new feature of PNGOUT that I quite like is its ability to tell what filter the PNG was compressed with, and if you just run PNGOUT on it without changing the color type, or specifying a filter, it will default to the filter type that it was compressed with before.
This means you can use OptiPNG to check all the filter types (it will "usually" find and save the best one). Then, when you pass the file to pngout, it will use that filter type.
For a quick script, I'd suggest running the programs in this order:
OptiPNG - find best filter type, optimise color type
PNGOUT - compress with better compression method
AdvDef - try a different deflate method
DeflOpt - optimise the deflation, remove ancillary chunks
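A minimal sketch of that quick pass (untested; it assumes all four programs are on the PATH, and /y answers pngout's overwrite prompt - the other switches all appear earlier in this thread):
rem quickpng.cmd (hypothetical name) - usage: quickpng file.png
optipng.exe -q %1
pngout.exe /q /y %1
advdef.exe -z4 %1
deflopt.exe %1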
markcramer at
I'm still working on my PNG optimizer script as well. Thanks, Ken, for the /n and /r switches- they seem to help!
I've been playing around with trying to find out whether pngout /c3 or pngrewrite has actually modified the palette order, and have written a utility to pull out the CRC of the PLTE chunk as a fast check of whether it's worth doing more trials. I.e. if the CRC of the PLTE chunk hasn't changed, neither has the palette order, so it's not worth repeating any of the trials.
One thing I have been looking for is a utility to convert a graphic to a PNG using a specific palette or PLTE chunk,
e.g. something like: Pngout /c3="MyPalette.Pal" File.Png
Does anyone know of a command-line graphics converter that will do that, or would you consider adding it as an option, Ken?
Edit: I noticed that IrfanView has an /Import_pal command line option, but that doesn't seem to work the way I imagined it would and appears to corrupt the output.
ace_dent at
Minimum block size?
Howdy. I've been looking at implementing some of the suggestions posted earlier. One idea I had was to try and exhaustively test different numbers of blocks (once pngout had found optimal parameters for palette, filter, etc.). To do this (simply) I am planning to take the file size (z), divide by the minimum block size (a) and calculate the maximum number of blocks to try (b_max).... b_max = z (bytes) / a (bytes) ... Then try /n = 0 to b_max (step size of 1?).
My questions:
1. If b_max is non-integer, how will pngout respond (would be nice to round up to whole number).
2. What should I use for a minimum block size? I have no idea of the overhead for a huffman block... should a=64 or 128 bytes for example... ?
3. If I run a set of trials to optimise the other parameters, can I then run pngout with the new '/ks' switch and retain these settings while varying the block size?
4. Is this just a crazy scheme... or has anyone bothered doing something similar?
I look forward to your feedback.
All the best- Andrew.
Awesoken at
PNG optimization guide
1. If b_max is non-integer, how will pngout respond (would be nice to round up to whole number).
/b# takes a floating point number and can result in a different number of blocks even if it rounds to the same integer. For exhaustive testing, it is better to use /n#.
2. What should I use for a minimum block size? I have no idea of the overhead for a huffman block... should a=64 or 128 bytes for example... ?
For exhaustive testing, use the /n# option. /b# is merely an option of convenience - it specifies the same thing in a less exact way. PNGOUT currently supports 1 to 1024 blocks per file (/n1 to /n1024). I usually start at 1 and increment until the file starts getting significantly bigger per iteration. Once the best n is found, I then run with /r until satisfied. If other /n# values are close, it may be worth running /r on those as well.
can I then run pngout with the new '/ks' switch and retain these settings while varying the block size?
Yes. /ks will preserve the settings (/c,/f,/d,/b) unless you have specifically overridden them at the command line.
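A minimal sketch of that sweep (a hypothetical wrapper, not Ken's exact procedure; it relies on pngout only keeping a result when it is smaller, and on "set /a" truncating, so a non-integer block count simply rounds down):
@echo off
rem nsweep.cmd - usage: nsweep file.png
rem Allow roughly one block per 128 bytes, capped at pngout's /n1024 limit.
set /a nmax=%~z1/128
if %nmax% gtr 1024 set nmax=1024
if %nmax% lss 1 set nmax=1
for /l %%n in (1,1,%nmax%) do pngout.exe /q /y /ks /n%%n %1
rem Then re-run the winner a few times with randomized initial tables (/r).
for /l %%r in (1,1,20) do pngout.exe /q /y /ks /r %1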
ace_dent at
Clarification
Hi Ken.
Thanks for the rapid response. However, I think I should clarify that I was actually referring to using the '/n' rather than the '/b' switch- following your earlier suggestion. So, (1) does '/n' take a float and round?
Most of the files I work on are only a few hundred bytes, so (for example) /n1024 would result in blocks containing less than a byte. This was why I asked (2) what is a practical minimum size for blocks? What is the overhead for creating a block?
I really like the sound of how the new '/ks' switch functions- very happy it will even preserve the number of blocks. This will allow me to run multiple '/r' trials as a final step, as you suggested.
Cheers for the help- greatly appreciated.
ace_dent at
New script: 'pngslim' - please test! ;-)
I've just uploaded the new script I've been working on 'pngslim', here: http://people.bath.ac.uk/ea2aced/tech/png/pngslim.zip
Please test & tweak; as always I look forward to your feedback... I'm also looking for someone with the interest to get this ported to Mac, to benefit icon designers on that platform.
Cheers, Andrew
Edit- Removed mirror.
Edited by ace_dent at
counting_pine at
Re: PNG optimization guide
Nice script, very thorough. Good idea to use /s1 for the stages leading up to the final randomization trials. Don't forget to try /s0 without randomized Huffman tables as well. The default trees might behave a bit differently from the randomized ones. You might also be able to shave off another byte or two if you tweak the batch file to run deflopt on each randomized output.
I'd be tempted to try the Huffman strategy with all the filter types since it behaves quite differently, and what compresses best for the pattern matching strategies might not be the best for Huffman-only compression.
ace_dent at
Great advice- thanks
Cheers counting_pine. Your suggestions now appear in the script v0.7 (updated as of now).
- Added /s0 to the strategy trial.
- Played with adding DeflOpt to the random Huffman trial- minimal benefit and more complex code, so this has not been added.
- Testing all Huffman strategies with all filter types- yes, added using OptiPNG. Changed from f0,5 to f0-5. Small impact on speed.
Thanks again. All further suggestions welcomed.
ace_dent at
Interesting images for png-geeks ;-)
These two images might interest some PNG geeks/hackers... http://people.bath.ac.uk/ea2aced/OOo/lc_duplicatepage.png http://people.bath.ac.uk/ea2aced/OOo/lc_filldraft.png
Both files have been optimized using my original brute-force script. However, they don't compress as well using my new (more intelligent?!) script. I'm interested in why this is so... but so far I can't see where the extra optimization comes from.
So anyone want to have a play with these? :-)
counting_pine at
Re: PNG optimization guide
I'm curious. I can't get on to people.bath.ac.uk though. Maybe it's down, or maybe it doesn't allow off-campus connections?
ace_dent at
Sorry, Uni webspace seems to be down- >sigh< "Filestore Fault - Many services disrupted"... bah...
You can find copies here (from OpenOffice webCVS): http://ui.openoffice.org/source/browse/*checkout*/ui/ooo_custom_images/industrial/res/commandimagelist/lc_duplicatepage.png?rev=1.1.50.1 http://ui.openoffice.org/source/browse/*checkout*/ui/ooo_custom_images/industrial/res/commandimagelist/lc_filldraft.png?rev=1.1.50.1
- Also added mirror site for the script
Cheers
Edited by ace_dent at
counting_pine at
Hmm, I think it's because these images compress to similar sizes with both color types 3 and 6. During the initial tests, they compress slightly better with the palette, but more extensive testing with randomised trials and DeflOpt shows that they can compress better without it. This probably happens because they're quite small images, and the palette is about the same size as the IDAT. The palette itself isn't compressible, though. More thorough testing can squeeze more out of the IDATs, but the IDAT was a lot smaller to start with for the palette image, so it doesn't make as much difference.
The only real way around this would be to be a lot more thorough with each color type, but that would be quite time-consuming. It may help a bit if you set the script up to run DeflOpt after each program's attempt.
The complication isn't too bad if you set up "subroutines" in a batch file by using the ability to call batch file labels. Here's an example:
@echo off
rem kflopt.bat
rem usage: [call] kflopt file.png [pngout options]

:replaceifsmaller
rem Report the trial file's size, then keep it only if it beats the original.
echo %~z2
if %~z2 lss %~z1 (echo y | copy %2 %1)
goto :eof
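The main part of the batch file would then use the subroutine along these lines (a hypothetical sketch, with trial.png as a scratch copy):
copy /y %1 trial.png >nul
pngout.exe /q /y %2 %3 %4 %5 trial.png
call :replaceifsmaller %1 trial.png
goto :eof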
Sined at
OptiPNG 0.5.5 is out! Updating pngslim with that new version could be useful!
Another tip: I've heard that .cmd files only work on Windows XP and later, while .bat files work on all Windows versions (Win98, XP...). Maybe renaming pngslim.cmd to pngslim.bat would be better?
Edited by Sined at
counting_pine at
One of the things Windows NT (/2K/XP) has that is different from Windows 9x is its command prompt. Windows 9x is built on top of DOS, and its command prompt is pretty much just a DOS prompt in a window, with long filename support.
Windows NT's command window is a bit more advanced, and it's possible to do a certain level of scripting in it. Just check out the HELP pages for commands like for, set, if, and you'll see they're a lot more extensive than the old DOS versions. I don't know whether it's by design or luck, but using the extension .cmd is a handy way of labelling a batch file so that WinNT can handle it but Win9x can't, and it allows you to write batch files that use the extra features of the WinNT command prompt without having to worry so much about what will happen if you try to run them on Win9x.
pngslim does use some of WinNT's extra features - and to good effect, such as the ability to get a file's size and compare it with what it used to be - and Win9x would probably choke on it big-time. So, it's a good idea to use the .cmd extension for it.
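For instance, here are two of those WinNT-only features in miniature (a sketch along the lines of what pngslim does, not its actual code):
rem %~z1 expands to the size in bytes of the file passed as %1,
rem and "set /a" does integer arithmetic - neither works under Win9x.
set before=%~z1
pngout.exe /q /y %1
set /a saved=%before%-%~z1
echo Saved %saved% bytes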
ace_dent at
@counting_pine Thanks for doing the analysis of those images I posted. I have been experimenting with different approaches to avoid getting stuck in local minima at the cost of final compression (while trying to keep the script simple). Will post my thoughts and an update soon. It seems more brute force is required. It would be great to know your thoughts on a reasonable setting for the minimum block size (recently changed to 128 bytes). Any ideas on the overhead per block?
@Sined Cheers for the heads-up. I don't think this update would bring a major improvement in compression, so I will release an updated package when I have improved the script more. The extension will remain '.cmd' for the reasons stated. In the latest release, the script is described as the 'WinXP edition', since ported versions will follow... hopefully... ;-)
Thanks again!
captainfreedom at
Re: New script: 'pngslim' - please test! ;-)
ace_dent said at
I've just uploaded the new script I've been working on 'pngslim', here: http://people.bath.ac.uk/ea2aced/tech/png/pngslim.zip
Please test & tweak; as always I look forward to your feedback... I'm also looking for someone with the interest to get this ported to Mac, to benefit icon designers on that platform.
Cheers, Andrew
I tried this script and it manages to shave another 0.5% off PNGs that are already optimised with pngout. It's a pity that you have to use drag and drop, because there is some sort of limit on the number of files that can be dropped in one go. It would be much better if you could just make it do all the PNGs within a directory. Anyway, thanks for the script ;)
ace_dent at
Re: PNG optimization guide
Posted updated version of 'pngslim' v0.9. Only minor update with no changes that will improve compression. I have yet to find time to improve the script and work on porting. It would be great if anyone out there can collaborate to get this ported to OS X and Linux...
captainfreedom at
ace_dent said
Posted updated version of 'pngslim' v0.9. Only minor update with no changes that will improve compression. I have yet to find time to improve the script and work on porting. It would be great if anyone out there can collaborate to get this ported to OS X and Linux...
Hi Andrew, I must thank you for writing such a useful script. I am using it to get my mobile game below the size limit. Unfortunately, the "drag and drop" method of selecting the files imposes a limit of 20 files at a time (it's some stupid Windows limitation). This prevents you from just leaving the script running overnight, and processing a large number of files takes a long time (several hours). This makes things very awkward. It would be very nice if you could change it so that all the PNGs in a certain directory are processed in one batch. I tried to modify the script myself, but unfortunately I couldn't make head or tail of it :(
ace_dent at
drag and drop - WinXP limits...
Howdy.
Glad the script has been of some use. From the 'readme.txt':
Note for MS Windows users: due to limitations of the OS, the maximum number of files that you can drag-n-drop depends on the total text length of the image file paths+names. If you see the following error message: "Windows cannot access the specified device, path, or file...", you should reduce the number of selected files per drag-n-drop (typically <100), and consider moving your files to shorten the path names (e.g. "C:\png\").
Try creating "C:\png\" and placing your files there. I can normally optimize a batch of ~90 files. Since I often optimize 1000+ files, I just leave multiple batches (e.g. 4x 90) running overnight. Have you checked out PNGOutWin? The extra 0.5% improvement (mentioned in your prior post) is probably due to the lack of restrictions for mobile phone compatibility... I hope this doesn't cause you problems.
I don't have time at the moment to add your feature request (optimize whole directory), but will put it on my TODO list. Maybe some other user could implement this... and please re-post the code here! ;-)
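In the meantime, here is a one-line sketch of that mode for anyone who wants to hack it in (untested; it assumes pngslim.cmd accepts a single file argument per call):
for /r "C:\png" %%f in (*.png) do call pngslim.cmd "%%f"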
Cheers, Andrew
ace_dent at
pngrewrite - necessary?
Hi- a question aimed mostly at 'counting_pine'... (other answers welcomed!) I currently still run 'pngrewrite' as a first step- mainly for historical reasons. It now seems Pngout handles all color and palette reductions anyway... Is it necessary for me to include pngrewrite in my pngslim script any more?
Cheers, Andrew
counting_pine at
Re: PNG optimization guide
Hi Andrew,
I would say running pngrewrite is no longer necessary now. Since you run pngout /s4 /force before pngrewrite, that should both optimise the color type and remove any ancillary chunks. Also, optipng should catch any optimisations that pngout doesn't.
I couldn't guarantee that the script will consistently produce the same results with and without pngrewrite, but that would mainly be due to small things like palette ordering, or perhaps disagreements about the 'optimal' color type to use in some cases. Neither way will be definitively better or worse.
ace_dent at
Cheers for the information 'counting_pine'. I will still experiment a little more before dropping pngrewrite.
Regards, Andrew
captainfreedom at
Re: drag and drop - WinXP limits...
ace_dent said at
The extra 0.5% improvement (mentioned in your prior post), is probably due to the lack of restrictions for mobile phone compatibility... I hope this doesn't cause you problems.
Thanks for the answer. When you talk about phone compatibility, you are talking about the /mincodes2 option in pngout, right?
ace_dent at
Minor update: pngslim v091
Posted updated version of 'pngslim' v0.91. Minor update (mainly to add license details for pngout), with a small change that may improve compression in some circumstances. http://people.bath.ac.uk/ea2aced/tech/png/pngslim.zip
Re. mobile phone compatibility: yes, the /mincodes2 option in pngout is not used. I also suspect that DeflOpt will not worry about those restrictions either... In fact, pngout is the only program that has explicitly added a config option to cover this issue- so I'd assume that other programs may or may not write 'nice' PNG files.
Cheers, Andrew
captainfreedom at
Re: Minor update: pngslim v091
ace_dent said at
Posted updated version of 'pngslim' v0.91. Minor update (mainly to add license details for pngout), with a small change that may improve compression in some circumstances. http://people.bath.ac.uk/ea2aced/tech/png/pngslim.zip
Re. mobile phone compatibility: yes, the /mincodes2 option in pngout is not used. I also suspect that DeflOpt will not worry about those restrictions either... In fact, pngout is the only program that has explicitly added a config option to cover this issue- so I'd assume that other programs may or may not write 'nice' PNG files.
Cheers, Andrew
Ah well, in that case I will just run pngout /mincodes2 /force after running pngslim and it will sort things out. (By the way, /mincodes2 does not make files much bigger, if at all.)
ace_dent at
New release: v1.0 Beta 1
A few changes in this release, so I've marked it a Beta until I get some feedback from the community. If anyone has any suggestions / improvements- please jump in!
v1.0beta3 23-September-2009
- Update introduces the first step in making the script smarter and more efficient.
- Bug fix for setting the program path using the 'App' directory (from 1.0b2).
- Improved the routine for determining the optimum number of Huffman blocks and the Deflate strategy used by pngout. It should be noticeably faster for large images, and may yield better compression in some cases.
- This is the first part of the script re-write, introducing improvements in readability and the addition of debug/geek feedback in the verbose mode (v=0/1).
TODO
- Remove 'pngrewrite' to evaluate the impact on compression. AFAIK all of its features are now replicated in the other software used.
- UPX-compress the bundled applications.
- Smarter processing for greatly improved speed. The next bottleneck is selecting the optimal color and filter settings (Trial 1). Will adjust the amount of brute force according to image size, and optimize in a more iterative way.
- Developing a PNG test corpus to validate compression and speed improvements.
Please feel free to suggest improvements.
... As always, feedback welcomed...
Cheers Andrew
ace_dent at
Version 1 has just been released. Have fun! ;-)
caveman at
Hello Andrew, I would like to port your script to Bash (Linux / Mac OS X). Do you have any objections?
I tried to get in touch with Ben Jos Walbeehm (DeflOpt's author), since I would like to use a Linux version of DeflOpt (and eventually have access to its source to compile it for Mac OS X); alas, he never answered my emails...
Concerning the PNG test corpus, there is a web page about PNG optimisation comparing different tools (including your script) that uses specific files: http://www.css-ig.net/comparatif-compression-png (it's in French; I'm a native French speaker, so if you want to contact the author I could help).
Cheers Mathias
Edited by caveman at
ace_dent at
Hi Mathias
My script is public domain- please grab it, improve / re-write / tinker as you want. I would only ask that you release your script under the same liberal PD license- but of course that is entirely your decision. The goal has always been to get this ported to a Bash script and running on a Mac, since most designers are on that platform. I don't have the skill or time- so it would be great if you did this.
Please note, all of the software bundled with pngslim is released under different licenses. I certainly know you won't be able to re-distribute pngout without the kind permission of Ken and David (Ardfry.com).
I had already grabbed the png images from css-ig.net for my Test Suite. It was their benchmarking that partly prompted me to update pngslim. There is still considerable room to improve pngslim, and I hope your version will take the script further.
Cheers Andrew
CeeJay.dk at
Hi Andrew
I was researching ways to improve a very old script of mine that did the same as yours when I stumbled upon this thread and your script. Seeing as your script is more advanced and intelligent, I've decided to abandon my own script and instead incorporate my ideas into yours.
I'm still figuring out all the little details of your code, but I've managed to improve its handling of unrecognised PNG files (usually 64-bit PNGs) so they too can be optimized. I use i_view32.exe from IrfanView (http://irfanview.com/) to save the PNGs that PNGOUT does not recognise into a format it does recognise.
I have rewritten the first part of stage 10 to look like this:
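(The exact snippet wasn't preserved in this post; the following is a sketch of the idea, assuming pngout returns a non-zero errorlevel when it cannot read a file, and using IrfanView's /convert switch:)
pngout.exe -q -s4 -f0 -c6 -k0 -force %1
if errorlevel 1 (
    rem pngout could not read the file: let IrfanView re-save it,
    rem then run the same pngout step again on the converted file.
    i_view32.exe %1 /convert=%1
    if not errorlevel 1 pngout.exe -q -s4 -f0 -c6 -k0 -force %1
)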
IrfanView outputs errorlevel 1 if it cannot load a file or understand its format, and errorlevel 2 if it cannot save it.
I run pngout -q -s4 -f0 -c6 -k0 -force %1 again because I figured out that you use it not only to strip unnecessary chunks, but also to measure how big the PNG is uncompressed, and then determine the best strategy based on that. I could also have used a goto stage10, but that would have caused an infinite loop in the unlikely case that IrfanView output a file that pngout could not read.
I suppose it would output a 64-bit PNG if the file actually had more colours than could be contained in a 24/32-bit PNG, but I have never seen such a file. If you know of a test file for this case, I'd like to know.
Edited by CeeJay.dk at
CeeJay.dk at
I'd like to note that IrfanView saves its settings in its INI file, and you can set the level of compression it should use. You can also tell it to use its PNGOUT plugin, but the plugin has fewer options than pngout.exe.
Still, it could be useful. It might be used instead of the pngout -q -s4 -f0 -c6 -k0 -force %1 step.
CeeJay.dk at
Here is a little tip if you're using IrfanView or another image viewer or editor that allows you to open files with an external program. You can set it up to use pngslim.cmd as your external program and easily run pngslim on the image you are viewing or editing.
In IrfanView, press P and then M (to go to Options -> Properties -> Miscellaneous) and enter the path to pngslim.cmd where it says "Set external editors".
If you entered pngslim as your first editor, you can call it at any time with Shift+E. If you entered it as your second or third editor, you can call it from File -> Open with external editor.
cssig at
PNG tests
Just an update: http://www.css-ig.net/png-optimisation-compression Keep up the good work.