Show HN: Kraken.io – image optimization service (kraken.io)
93 points by kraken-io on Aug 11, 2013 | 59 comments


Very nicely done, kraken-io! Just threw over 500 images at it (mainly PNGs, but a few JPGs and GIFs as well), virtually all of which I had optimized beforehand. For the PNGs, I had used 4 different optimizers (PNGCrush, OptiPNG, AdvPNG, and PNGOut), but almost all images saw size reductions in lossless mode ranging from under 1% to over 50% (the latter for a few of the GIFs).

This massive job queued up quickly, and the total savings figures updated in real time as the job progressed. Really appreciate the "Download all kraked files in a ZIP archive" feature, as well as the "Keep directory structure" option.

Very clean site, service, and documentation - reminds me of the classic Slicehost service.


> For the PNGs, I had used 4 different optimizers (PNGCrush, OptiPNG, AdvPNG, and PNGOut), but almost all images saw size reductions in lossless mode ranging from under 1% to over 50%

I seriously doubt you can squeeze more than 1-2% after properly running PNGOut on an image.

This service only makes sense if you have huge images that you want to optimize. $19 for 1,000 8-megabyte images is a pretty good deal. That's a rare case though. If you have tiny images, just run PNGOut on them for free.


You can squeeze out a bit more (down to roughly 90-95% of the already-optimized size) with some brute force at the encoder level: more time spent testing different filters and rearranging the palette. But it takes a lot of CPU for little gain; at some point you are spending minutes to save bytes.
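A toy illustration of that kind of brute force, using Python's zlib as a stand-in for the DEFLATE step inside a PNG encoder (real optimizers also permute scanline filters and palette order, and search a far larger space):

```python
import zlib

def smallest_deflate(data: bytes) -> bytes:
    """Try every zlib level/strategy pair and keep the smallest output.

    A toy stand-in for what PNG optimizers do at the DEFLATE stage.
    """
    best = None
    for level in range(1, 10):
        for strategy in (zlib.Z_DEFAULT_STRATEGY, zlib.Z_FILTERED, zlib.Z_RLE):
            comp = zlib.compressobj(level=level, strategy=strategy)
            out = comp.compress(data) + comp.flush()
            if best is None or len(out) < len(best):
                best = out
    return best

data = b"the quick brown fox jumps over the lazy dog " * 200
fast = zlib.compress(data, 1)        # single quick pass
best = smallest_deflate(data)        # exhaustive search over 27 settings
print(len(fast), len(best))          # the exhaustive result is never larger
```

The 27-way search here is cheap; real tools like `pngout` or zopfli spend orders of magnitude more effort per byte saved, which is exactly the diminishing-returns trade-off described above.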

This service is much faster than that, so they probably don't use brute force at all.


1-2% might be possible by using aggressive gzip encoders such as https://code.google.com/p/zopfli/

( as seen in https://github.com/sayurin/optipng-zopfli and mentioned in https://twitter.com/pornelski/status/356843309118922756 )


We have been experimenting with Google's Zopfli but found that the optimization time is too long for such an insignificant optimization gain.

Optimization speed has always been a key factor for Kraken.io.


Another option using zopfli is pngtastic (disclaimer, my own project): https://github.com/depsypher/pngtastic

It really is significantly slower, but if you want to shave every last byte off a PNG, zopfli is your best bet.


I like it. I had the same idea - not that it's a difficult idea to have. Well executed.

However, I have some criticism.

> We like to think the only way to get your image files smaller after optimizing them with Kraken is to delete them.

I only tried PNG, but that claim is false. It can be made a bit smaller. The compressed sample is 65970 bytes, I got 63885 bytes here. A few additional bytes may be possible.

> It strips all metadata found in a given image

IMO, rather than stripping ICC profiles, the image should be converted to sRGB first (and the profile then stripped). You can argue that's the user's job, but not everyone will know to do it.

> max image size 8.0MB

Probably a bit small. I also don't like that you pay per image (be it 1 byte or 1 MB). Why not charge per pixel, or some other similar metric? The processing cost probably grows non-linearly with image dimensions anyway.

PS. The testimonials seem fake! :)


Thanks for all the suggestions.

Testimonials are real - trust us on this one :) We contacted the authors and asked for their permission to post their opinions on our pages.


Then you should consider linking to their websites. That would definitely make it look more authentic.


http://imageoptim.com/

Similarly, you can get some pretty good optimisations with this local, open-source OS X app, which is itself just a frontend for free Linux utilities.


And you don't have to transmit your images over the net... and... and... Frankly, I don't understand why anyone would want to pay USD 250 per year for a service which, albeit behind a beautiful website, just can't match the experience and control you get when running things locally.


> Frankly, I don't understand why anyone would want to pay USD 250 per year for a service which, albeit behind a beautiful website, just can't match the experience and control you get when running things locally.

People not wanting to run things locally are exactly why we decided to launch the Kraken service in the first place: just upload and forget, and let someone else deal with the complexities and use their resources. We actually decided to go commercial in response to a huge number of feedback emails asking for exactly this. It got to the point where we decided to found a company and make it happen, as it just made sense. Moreover, it's really difficult to work out a pricing strategy for a product such as an image optimization API, and we expect to amend our pricing in response to feedback and other learnings.


Thanks for the reply. As someone who is into photography, I would like to have the functionality Kraken provides on my local machine; even saving 10% on my huge pile of digital pictures would help me a lot, given that my machines run exclusively on SSDs, where storage space is still quite expensive.

That said: if you built a downloadable app which looks great and does what you are doing right now on your servers, I would be willing to spend. On the other hand, I would not upload my pics to any service just to get them back smaller; it simply takes too much time.

But I agree that your biz model probably does not target someone like me; best of luck to you!


I really like the service but PLEASE please do not strip all metadata out of photos uploaded by your users.

Stripping out the EXIF/IPTC copyright information creates an orphan work (no apparent creator), makes it harder to track down copyright infringements, and may be a DMCA violation (http://www.flickr.com/groups/nomorefreephotos/discuss/721576...). Ideally, you should strip out all the non-copyright EXIF fields (aperture, camera make, etc.) but leave the copyright info.
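The keep-copyright behaviour suggested here boils down to a filter over tag names. A conceptual sketch (the tag names and the flat dict view are simplifications for illustration; real stripping should operate on the actual EXIF/IPTC/XMP blocks with a tool like exiftool or an EXIF library):

```python
# Tags worth preserving when stripping metadata: authorship/licensing info.
COPYRIGHT_TAGS = {"Copyright", "Artist", "CopyrightNotice", "Creator", "Rights"}

def strip_metadata(tags: dict) -> dict:
    """Drop every tag except copyright/attribution fields.

    `tags` is a simplified {tag_name: value} view of an image's metadata.
    """
    return {k: v for k, v in tags.items() if k in COPYRIGHT_TAGS}

metadata = {
    "Artist": "Jane Doe",
    "Copyright": "(c) 2013 Jane Doe",
    "Make": "Canon",        # camera make: safe to strip
    "FNumber": "f/2.8",     # aperture: safe to strip
    "GPSLatitude": "51.5",  # location: often *should* be stripped for privacy
}
print(strip_metadata(metadata))
```

A keep-list like this also neatly handles the privacy angle: GPS and camera fields go, authorship stays.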

EXIF stripping is one of the subtle crimes we web developers have committed against content creators, and it's mostly out of laziness: a few hundred bytes of copyright info will not kill us, our users, or our servers.


What would be the difference from tools like http://tinypng.org and http://www.jpegmini.com/main/shrink_photo ?


Exactly what I was thinking. JPEGmini works better in my tests.


We handle all the major web image formats, with a single API. That would be the major difference. We also offer more features like image resizing, WebP compression and SVG optimization. Again, with a single API.


I'm not sure squeezing image sizes can be a winning point for your project. Pushing to S3 and Cloud Files is a very good feature, but if this could be extended into a full image-management solution that integrates easily with third-party CMSs like WordPress/Drupal, then I think it would be a killer.


Thank you for your suggestions.

In the coming weeks we will be partnering with a leading CDN provider. Every single optimized image will be pushed to 32+ edge locations. We think that makes much more sense than just an S3/CloudFiles feature.

A WordPress plugin is being developed as we speak.


Excellent news! I work for a very large publisher in Canada that uses WordPress as its CMS of choice. Because we use Akamai and shrug at bandwidth costs, I don't have much of an argument there for using this service, but shrinking all images on a page by 25% to 50% is going to have a noticeable impact on page rendering time.

The plug-in would have to make the process completely transparent, though: it should happen as part of the normal WordPress upload workflow, with not a single extra step for users to take.


As soon as the CDN integration is there, I will become a member. :)


I am very impressed with the website design and the well-thought-out, fast user interface. It seemed to gain around 5% on average. This is not useful for me personally, but I guess it can lead to significant savings if you have a high-traffic website.


This looks great!

How would I use this as part of an asset pipeline? If I'm running it as part of an automated process, I'd love to see prices that support that model, as well as caching so that optimizing the same image multiple times doesn't increase my bill. Any thoughts?


You could integrate this, but you don't want to send every single asset to them on every deploy, only the changed ones. You should just keep a manifest file of all krakenified images, and only push them up when you add/alter one.
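A minimal sketch of that manifest idea: hash each image and only re-submit files whose content hash isn't already recorded. The file names and helpers here are hypothetical, and the actual upload call is left out:

```python
import hashlib
import json

def file_hash(path):
    """Content hash of a file, so renames/moves don't force re-uploads."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def _load(manifest_path):
    try:
        with open(manifest_path) as f:
            return set(json.load(f))
    except FileNotFoundError:
        return set()

def needs_optimizing(paths, manifest_path):
    """Return only the files whose hash is not yet in the manifest."""
    seen = _load(manifest_path)
    return [p for p in paths if file_hash(p) not in seen]

def record(paths, manifest_path):
    """Mark files as optimized (call after a successful upload)."""
    seen = _load(manifest_path)
    seen.update(file_hash(p) for p in paths)
    with open(manifest_path, "w") as f:
        json.dump(sorted(seen), f)
```

Because the manifest keys on content hashes rather than file names, a clean checkout on another dev's machine still skips images whose bytes haven't changed, as long as the manifest file itself is committed to the repo.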


Well, that's where the problem is. I want to do a clean build on each deploy, and there will be many devs on many machines building it, so handling it locally isn't really possible. We could handle it by having a DB keep track, but that's not broadly applicable.

I bring this up because kraken is competing with free programs I already have installed. This is something I can do with those programs that I can't with kraken.


My quick 5 minute review of this site is: nice work. The design and feel of the site itself is very polished and professional. I threw a handful of "optimized" images at it, and sure enough it did squeeze some bytes out.

In a couple of the images I did notice a few artifacts with lossy compression, but nothing obvious; you'd have to look for it like I did, and if it were a priority you could always go lossless.

Overall I'd say nice work, but I agree the "per image" price might not be optimal.

Nice work! I'll be looking into this service more seriously for a project I have coming up.


I tried quickly with a PNG screenshot, and found that the result was bigger than what ImageOptim could do. It seems to me that you're not stripping color profile information, you might want to check it!


1. How does it compare to JPEGmini?

2. You need a (good) Wordpress plugin.


I second the WordPress plugin idea - I've got a few clients I'd sign up (probably only on the micro 500/month sized plan).


We are developing our official WP plugin at the moment, and should have something to show within 1-2 weeks.


Surprised no one has mentioned Yahoo!'s smushit.com service yet. It even works nearly identically (minus the API).

Kraken seems to do a tad better in the handful of images I handed it (all PNG sprites), but not remarkably so.

The Google PageSpeed extension for Chrome will also give you optimized images as one of the steps, if it thinks there is room for improvement.


There is a comparison to smushit right there on the homepage. :P


Yes, but no one has mentioned it and the differences in my tests were far less drastic than the comparison would suggest.


Very nice website and service, only thing I found an issue with was the max file size - 8MB.

This is very small. Imagine if someone wanted to take a bunch of photos straight out of a camera, send them to Kraken for optimisation, then have them back. They would have to do optimisations client-side beforehand, which rather defeats the point of Kraken.


Kraken's focus is on optimizing images for the Web. We think the 8MB limit for a single image on the "Enterprise" plan is more than enough for the Web.


I'm building a service right now that is a community site for people in the fashion photography (photographers/models/etc) world. The images that these people upload are sometimes much bigger than 8MB but we still need to optimize them for display on the web.

Would love to use something like Kraken as we'll be processing a TON of images, but 8MB limit would be a deal breaker.


I'd love to see a pay-as-you-go option. I run a SaaS where users can select a profile picture; however, we usually get 100-150 users at a time, then maybe nothing for a month. I'd be happy to pay $0.03 for each picture. Also, looking forward to the CDN partnership; that will only make it better :)


Hidden in the payment page is their "micro" option of $10/mo. Not quite PAYG but still half the cheapest option they advertise.


Nice site. I found 5% savings on about 40kb. Every little helps.

It's fantastic that you've made a free web interface, but it would be great if we could enter a URL and have it find all the images on the page and optimise them, rather than entering the URLs manually.


We have done exactly what you wish for, by providing a Chrome extension!

Just click the Kraken button (added after the plugin has been installed) and wait until all the images displayed on the currently open page are kraked.

You're welcome! :)

And here's the link: https://chrome.google.com/webstore/detail/krakenio-image-opt...


Just curious, how did you arrive at a 5am EST Sunday morning post time for your Show HN?

I'm finishing up a project I've been working on for quite a while, and I would be really interested to hear if anyone has any advice on best times to post?



What do they mean by "1.000 images" per month on the pricing page? Just one image or a thousand images?

Is it normal to use a dot there when writing a thousand?


Some European countries use a period instead of a comma as the thousands separator.

http://docs.oracle.com/cd/E19455-01/806-0169/overview-9/inde...


Countries that use the comma as a decimal separator use a point as a thousands separator.


I noticed that as well; they should just ditch the separator to avoid any confusion.


Or they could format the number according to the browser's locale
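In the browser that is essentially a one-liner with `Number.prototype.toLocaleString`. The convention itself can be sketched in a few lines (a toy mapping over a few assumed locale codes, not a real i18n library; production code should use the browser's locale APIs or ICU/CLDR data):

```python
def format_count(n: int, locale: str = "en") -> str:
    """Format an integer with a locale-appropriate thousands separator.

    Toy illustration only: the locale->separator mapping is a tiny
    hand-picked subset, not a complete or authoritative table.
    """
    # Many continental European locales use "." where English uses ","
    sep = "." if locale in {"de", "nl", "es", "it", "pl"} else ","
    return f"{n:,}".replace(",", sep)

print(format_count(4000, "en"))  # 4,000
print(format_count(4000, "de"))  # 4.000
```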


The website looks extremely professional, very well done.


Thank you for your kind words. We worked very hard on getting it "just right" in terms of the balance of look and feel and pagespeed optimization, including the use of SPDY to benefit Chrome users.


I agree, it feels very polished.

As an aside, the "4.000 images" rather than "4,000" on the API pricing page is momentarily confusing;

I'm guessing you're from the EU, which is cool, but then the pricing is in dollars. Anyway, I'm pretty sure it's an obvious oversight, so no need for me to harp on.


Do you have PCI DSS certification? The credit card number goes through your servers.


Yes, we are PCI DSS compliant. Our compliance was verified by Wirecard, our payment gateway.


How does Wirecard verify your PCI DSS compliance? It's usually an SAQ and pen scans.


Just had a thought: if I generate 8MB PNG files with noise in them and upload 1,000 such files, your servers will choke.

Maybe you should charge per MB.


You are right, our optimization workers will have a lot of work to do optimizing those images. In that case we recommend using the "callback_url" option to eliminate the possibility of a request timeout.


No, I mean it can be used to DoS you. Give it a try. Optimizing even one 8MB noise PNG is a huge amount of CPU work.
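The point is easy to demonstrate with zlib (standing in for the DEFLATE stage inside a PNG encoder): random data is essentially incompressible, so maximum-effort compression burns CPU for zero gain.

```python
import os
import zlib

noise = os.urandom(1 << 20)        # 1 MB of random bytes
packed = zlib.compress(noise, 9)   # maximum compression effort
ratio = len(packed) / len(noise)
print(f"{ratio:.4f}")              # at or just above 1.0: no savings at all
```

An 8MB noise PNG is the worst case for a brute-force optimizer: it tries every filter and encoder setting, and every attempt comes back no smaller than the input.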


I've just tried compressing a 12MB PNG file with pngcrush locally (generated with `convert -size 1000x2000 xc: +noise Random noise.png`); it only took a few seconds and didn't tax the processor much. Maybe pngcrush handles it better than other tools, but it didn't seem onerous.


I would guess that the service does something more along the lines of `pngcrush -brute` which is significantly more taxing.


This SaaS trend is great. Dibs on the grep API, so you guys can pay me to do greps. Then who does the date API? /s



