Very nicely done, kraken-io! Just threw over 500 images at it (mainly PNGs, but a few JPGs and GIFs as well), virtually all of which I had optimized beforehand. For the PNGs, I had used 4 different optimizers (PNGCrush, OptiPNG, AdvPNG, and PNGOut), but almost all images saw size reductions in lossless mode ranging from under 1% to over 50% (the latter for a few of the GIFs).
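A pass like that is easy to script; here's a sketch of the kind of wrapper involved, assuming all four tools are on PATH (flags are from memory and worth double-checking):

```python
# A sketch of scripting a multi-tool PNG pass, assuming pngcrush, optipng,
# advpng, and pngout are all on PATH (flags worth double-checking).
import shutil
import subprocess
from pathlib import Path

def optimize_png(path: Path) -> None:
    tmp = path.with_suffix(".tmp.png")
    # pngcrush writes to a separate output file, so move it back over the original
    subprocess.run(["pngcrush", "-brute", str(path), str(tmp)], check=True)
    shutil.move(str(tmp), str(path))
    # the remaining tools recompress in place
    subprocess.run(["optipng", "-o7", str(path)], check=True)
    subprocess.run(["advpng", "-z", "-4", str(path)], check=True)
    # pngout exits non-zero when it can't shrink the file; don't treat that as failure
    subprocess.run(["pngout", str(path)], check=False)

for png in Path("images").rglob("*.png"):
    optimize_png(png)
```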
This massive job queued up quickly, and the total savings figures updated in real time as the job progressed. Really appreciate the "Download all kraked files in a ZIP archive" feature, as well as the "Keep directory structure" option.
Very clean site, service, and documentation - reminds me of the classic Slicehost service.
> For the PNGs, I had used 4 different optimizers (PNGCrush, OptiPNG, AdvPNG, and PNGOut), but almost all images saw size reductions in lossless mode ranging from under 1% to over 50%
I seriously doubt you can squeeze more than 1-2% after properly running PNGOut on an image.
This service only makes sense if you have huge images that you want to optimize. $19 for 1000 8-megabyte images is a pretty good deal. That's a rare case though. If you have tiny images, just run PNGOut on them for free.
You can squeeze out a bit more (down to roughly 90-95% of the already-optimized size) with some brute force at the encoder level: more time spent testing different filters and rearranging the palette. But it takes a lot of CPU for little gain; at some point you might spend minutes to save bytes.
This service is much faster than that, so they probably don't use brute force at all.
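To illustrate what that kind of brute force looks like in practice, here's a sketch that tries each PNG filter heuristic separately and keeps the smallest output (it assumes optipng and its `-f`/`-zc` flags):

```python
# Illustrative brute force: run each PNG filter heuristic separately and keep
# the smallest output. Assumes optipng; -f selects the filter, -zc9 the zlib level.
import shutil
import subprocess
from pathlib import Path

def brute_force(src: Path) -> Path:
    best, best_size = src, src.stat().st_size
    for f in range(6):  # filter heuristics 0..5
        cand = src.with_name(f"{src.stem}.f{f}.png")
        shutil.copy(str(src), str(cand))
        subprocess.run(["optipng", f"-f{f}", "-zc9", str(cand)], check=True)
        size = cand.stat().st_size
        if size < best_size:
            best, best_size = cand, size
    return best  # note: leaves the candidate files around; fine for a sketch

print(brute_force(Path("logo.png")))
```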
I like it. I had the same idea - not that it's a difficult idea to have. Well executed.
However, I have some criticism.
> We like to think the only way to get your image files smaller after optimizing them with Kraken is to delete them.
I only tried PNG, but that claim is false: the file can be made a bit smaller. The compressed sample is 65970 bytes; I got 63885 bytes here. A few additional bytes may be possible.
> It strips all metadata found in a given image
IMO, rather than stripping ICC profiles, the image should be converted to sRGB first (and the profile then stripped). You can argue that's the user's job, but not everyone may know.
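Something along these lines would do it; a sketch using Pillow's ImageCms module (it assumes a Pillow build with littlecms support):

```python
# A sketch of "convert to sRGB, then strip the profile", using Pillow's
# ImageCms module (requires a Pillow build with littlecms support).
import io
from PIL import Image, ImageCms

def to_srgb_stripped(src: str, dst: str) -> None:
    im = Image.open(src)
    icc = im.info.get("icc_profile")
    if icc:
        # Convert pixel data from the embedded profile to sRGB
        src_profile = ImageCms.ImageCmsProfile(io.BytesIO(icc))
        im = ImageCms.profileToProfile(im, src_profile, ImageCms.createProfile("sRGB"))
    im.info.pop("icc_profile", None)  # saving without the key drops the profile
    im.save(dst)

to_srgb_stripped("photo.png", "photo-srgb.png")
```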
> max image size 8.0MB
Probably a bit small. I also don't like that you pay per image (be it 1B or 1MB). Why not charge per pixel, or some other similar metric? Processing cost probably increases non-linearly with image dimensions.
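A toy comparison shows why the metric matters; the per-megapixel rate here is made up:

```python
# Toy comparison with made-up rates: flat per-image pricing charges an icon
# and a 12-megapixel photo the same; a per-megapixel rate would not.
PER_IMAGE = 19 / 1000     # $19 per 1,000 images (rate quoted upthread)
PER_MEGAPIXEL = 0.002     # hypothetical rate, $ per megapixel

for w, h in [(100, 100), (4000, 3000)]:
    mp = w * h / 1e6
    print(f"{w}x{h}: flat ${PER_IMAGE:.3f} vs per-MP ${mp * PER_MEGAPIXEL:.5f}")
```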
Similarly, you can get some pretty good optimisations with a local, open-source OS X app, which is itself just a frontend for free Linux utilities.
And you don't have to transmit your images over the net... and... and... Frankly, I don't understand why anyone would pay USD 250 per year for a service which, albeit behind a beautiful website, just can't match the experience and control you get when running things locally.
> Frankly, I don't understand why anyone would pay USD 250 per year for a service which, albeit behind a beautiful website, just can't match the experience and control you get when running things locally.
People not wanting to run things locally are exactly why we decided to launch the Kraken service in the first place: just upload and forget, let someone else deal with the complexities and use their resources. We actually decided to go commercial in response to a ludicrous number of feedback emails asking for this. It got to the point where we decided to found a company and make it happen; it just made sense. Moreover, it's really difficult to work out a pricing strategy for a product such as an image optimization API, and we expect to amend our pricing in response to feedback and other learnings.
Thanks for the reply. As someone who is into photography, I would like to have the functionality that Kraken provides on my local machine; even saving 10% on my huge pile of digital pictures would help me a lot, given that my machines run exclusively on SSDs, where storage space is still quite expensive.
That said: if you built a downloadable app which looks great and does what you are doing right now on your servers, I would be willing to pay. On the other hand, I would not upload my pics to any service just to get them back smaller; it just takes too much time.
But I agree that your biz model probably does not target someone like me; best of luck to you!
I really like the service but PLEASE please do not strip all metadata out of photos uploaded by your users.
Stripping out the EXIF/IPTC copyright information creates an orphan work (no apparent creator), makes it harder to track down copyright infringements, and may be a DMCA violation (http://www.flickr.com/groups/nomorefreephotos/discuss/721576...). Ideally, you should strip out all the non-copyright EXIF fields (aperture, camera make etc) but leave the copyright info.
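A sketch of what I mean, using the piexif library (tag IDs per the EXIF spec; IPTC would need separate handling):

```python
# Keep only the Artist and Copyright fields from the 0th IFD and drop
# everything else. Works on JPEGs; IPTC data needs separate handling.
import piexif

KEEP = {piexif.ImageIFD.Artist, piexif.ImageIFD.Copyright}

def strip_exif_keep_copyright(path: str) -> None:
    exif = piexif.load(path)
    kept = {tag: val for tag, val in exif["0th"].items() if tag in KEEP}
    new_exif = {"0th": kept, "Exif": {}, "GPS": {}, "1st": {}, "thumbnail": None}
    piexif.insert(piexif.dump(new_exif), path)  # rewrites the JPEG in place

strip_exif_keep_copyright("photo.jpg")
```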
EXIF stripping is one of the subtle crimes we web developers have committed against content creators, and it's mostly out of laziness: a few hundred bytes of copyright info will not kill us, our users, or our servers.
We handle all the major web image formats, with a single API. That would be the major difference. We also offer more features like image resizing, WebP compression and SVG optimization. Again, with a single API.
I'm not sure if squeezing image sizes can be a winning point for your project; pushing to S3 and Cloud Files was a very good feature. But if this can be extended into a full image management solution that can easily be integrated with third-party CMSs like WordPress/Drupal, then I think it will be a killer.
In the coming weeks we will be partnering with a leading CDN provider. Every single optimized image will be pushed to 32+ edge locations. We think that makes much more sense than just an S3/CloudFiles feature.
Excellent news! I work for a very large publisher in Canada that uses WordPress as its CMS of choice. Because we use Akamai and shrug at bandwidth costs, I don't have much of an argument there to use this service, but shrinking all images on a page by 25% to 50% is going to have a noticeable impact on page rendering time.
The plug-in would have to make the process completely transparent, though - it should happen as part of the normal WordPress upload workflow, with not even one extra step for users to have to take.
I am very impressed with the website design and the well-thought-out, fast user interface. It seemed to be able to gain around 5% on average. This is not useful for me personally, but I guess it can lead to significant savings if you have a high-traffic website.
How would I use this as part of an asset pipeline? If I'm running it as part of an automated process, I'd love to see prices that support that model, as well as caching so that doing the same image multiple times doesn't increase my bill. Any thoughts?
You could integrate this, but you don't want to send every single asset to them on every deploy, only the changed ones. You should just keep a manifest file of all krakenified images, and only push them up when you add/alter one.
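Something along these lines, as a sketch (send_to_kraken is a placeholder for the actual upload call, not a real API):

```python
# Sketch of the manifest approach: record a content hash per image and only
# send files whose hash changed. send_to_kraken() is a placeholder.
import hashlib
import json
from pathlib import Path

MANIFEST = Path("kraked-manifest.json")

def send_to_kraken(path: Path) -> None:
    print(f"would upload {path} for optimization")  # placeholder upload call

def sync(images_dir: Path) -> None:
    seen = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}
    for img in sorted(images_dir.rglob("*.png")):
        digest = hashlib.sha256(img.read_bytes()).hexdigest()
        if seen.get(str(img)) != digest:  # new or altered image
            send_to_kraken(img)
            seen[str(img)] = digest
    MANIFEST.write_text(json.dumps(seen, indent=2))

sync(Path("assets/images"))
```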
Well, that's where the problem is. I want to do a clean build on each deploy, and there will be many devs on many machines building it, so handling it locally isn't really possible. We could handle it by having a DB keep track, but that's not broadly applicable.
I bring this up because kraken is competing with free programs I already have installed. This is something I can do with those programs that I can't with kraken.
My quick 5 minute review of this site is: nice work. The design and feel of the site itself is very polished and professional. I threw a handful of "optimized" images at it, and sure enough it did squeeze some bytes out.
In a couple of the images I did notice a few artifacts with lossy compression, but nothing obvious; you'd have to look for it like I did, and if it were a priority you could always use lossless mode.
Overall I'd say nice work, but I agree the "per image" price might not be optimal.
Nice work! I'll be looking into this service more seriously for a project I have coming up.
I tried quickly with a PNG screenshot and found that the result was bigger than what ImageOptim could do. It seems to me that you're not stripping color profile information; you might want to check that!
Very nice website and service; the only thing I took issue with was the max file size of 8MB.
This is very small. Imagine if someone wanted to take a bunch of photos straight out of a camera, send them to Kraken for optimisation, then get them back. They would have to do optimisations client-side beforehand, which kind of ruins the point of Kraken.
Kraken's focus is on optimizing images for the Web. We think the 8MB limit for a single image on the "Enterprise" plan is more than enough for the Web.
I'm building a service right now that is a community site for people in the fashion photography world (photographers/models/etc). The images these people upload are sometimes much bigger than 8MB, but we still need to optimize them for display on the web.
Would love to use something like Kraken as we'll be processing a TON of images, but the 8MB limit would be a deal breaker.
I'd love to see a pay-as-you-go option. I run a SaaS where the user can select a profile picture; however, we usually get 100-150 users at a time, then maybe nothing for a month. I'd be happy to pay $0.03 for each picture. Also, looking forward to the CDN partnership; that will only make it better :)
Nice site. I found 5% savings on about 40kb. Every little helps.
It's fantastic that you've made a free web interface, but it would be great if we could enter a URL and it would find all the images on the page and optimise them, rather than entering the URLs manually.
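Collecting the image URLs from a page is the easy half; a rough sketch using requests and BeautifulSoup:

```python
# Rough sketch of the "crawl a page" idea: collect <img> sources so they
# can be fed to the optimizer (uses requests and BeautifulSoup).
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def image_urls(page_url: str) -> list[str]:
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return [urljoin(page_url, img["src"]) for img in soup.find_all("img", src=True)]

print(image_urls("https://example.com"))
```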
Just curious, how did you arrive at a 5am EST Sunday morning post time for your Show HN?
I'm finishing up a project I've been working on for quite a while, and I would be really interested to hear if anyone has any advice on best times to post?
Thank you for your kind words. We worked very hard on getting it "just right" in terms of the balance of look and feel and pagespeed optimization, including the use of SPDY to benefit Chrome users.
As an aside, the "4.000 images" (where "4,000" is presumably meant) on the API pricing page is momentarily confusing.
I'm guessing you're from the EU, which is cool, but then the pricing is in dollars. Anyway, I'm pretty sure it's an obvious mistake, so no need for me to harp on it.
You are right; our optimization workers will have a lot of work to do to optimize those images. In that case we recommend using the "callback_url" option to eliminate the possibility of a request timeout.
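For illustration, an async request with callback_url might look like the sketch below; the exact field names should be checked against the API documentation:

```python
# Illustrative async request using callback_url; field names follow my reading
# of the Kraken API docs, so verify against the current reference before use.
import requests

payload = {
    "auth": {"api_key": "YOUR_KEY", "api_secret": "YOUR_SECRET"},
    "url": "https://example.com/huge-image.png",
    "callback_url": "https://yourapp.example.com/kraken-callback",
}
resp = requests.post("https://api.kraken.io/v1/url", json=payload, timeout=30)
print(resp.json())  # the optimization result is POSTed to callback_url later
```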
I've just tried compressing a 12 MB PNG file with pngcrush locally (generated with `convert -size 1000x2000 xc: +noise Random noise.png`); it only took a few seconds and didn't tax the processor much. Maybe pngcrush handles it better than other tools, but that didn't seem onerous.