There and Back Again

Everything is back up

The server is back up and DNS should be updated for most things. Send me email at jeichorn at gmail.com or josh@bluga.net if you're still having any problems.

9 thoughts on “Everything is back up”

  1. Ray Paseur

    Josh, I seem to be doing something wrong. I cannot get the 1280 x 1024 webthumb capture to work. It returned 1024 x 786 (not 768, 786). So a couple of questions:

    1. Am I missing something?
    2. Do you license the software you use to do the “screen capture” of the site?

    My objective is a very high quality image (probably a PNG) of a screen capture, similar to what I would get from visiting a web site and doing the equivalent of a pixel-by-pixel copy of the screen.

    Thanks, Ray

  2. Bob Walsh

    I get the following error trying your service from your home page in FF3 or Safari:

    Error, currentId, front page submission requires cookies, and a reload of the front page between thumbnail submissions

    Care to comment?

  3. Joshua Eichorn Post author

    Bob:

    For some reason you're hitting my abuse checking from the front page. I provide an API, and I don't want people trying to abuse that form. If you reload the front page things should work; if not, you can always sign up for an account and that should fix things.

  4. Joshua Eichorn Post author

    Ray:

    I emailed you as well but you were actually hitting a bug that was causing extra resizing.

    I just finished upgrading everything to the Gecko 1.9 rendering engine, and 1280×1024 full thumbnails should be working correctly now. You just need to make sure you add fullthumb to your request and set the output type to png.
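    For anyone wiring this up, a request with those options might be built like the sketch below. This is a hedged example: the XML element names (`apikey`, `request`, `url`, `outputtype`, `fullthumb`, `width`, `height`) and overall shape are assumptions based on this comment, not verified API documentation.

```python
# Hypothetical sketch of a webthumb XML request body for a 1280x1024
# full-size PNG capture. Element names are assumptions based on the
# comment above, not confirmed documentation -- check the real webthumb
# API docs before relying on them.
import xml.etree.ElementTree as ET

def build_fullthumb_request(api_key, page_url, width=1280, height=1024):
    """Build an XML body asking for a full-size PNG capture."""
    root = ET.Element("webthumb")
    ET.SubElement(root, "apikey").text = api_key
    req = ET.SubElement(root, "request")
    ET.SubElement(req, "url").text = page_url
    ET.SubElement(req, "width").text = str(width)
    ET.SubElement(req, "height").text = str(height)
    ET.SubElement(req, "outputtype").text = "png"  # PNG avoids JPEG artifacts
    ET.SubElement(req, "fullthumb").text = "1"     # ask for the unscaled capture
    return ET.tostring(root, encoding="unicode")

body = build_fullthumb_request("MY_API_KEY", "http://bluga.net")
print(body)
```

    The body would then be POSTed to the webthumb API endpoint with whatever HTTP client you prefer.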

  5. Ray Paseur

    Josh, I just tried it again and it worked great. Thanks for your help. If you can, tell me a little more about how the rendering engine gets its view of the web page. For example, I notice that even with 1280-wide PNG output the text appears a little soft. The pictures are pretty faithful to the original, but look marginally softer, like they may have been through one pass of JPEG compression or some such thing. Does Gecko do this?

    Here is a link to the file from BLUGA:
    http://www.imaginedb.com/webthumb/bluga.png

    Here is the direct screen capture (Firefox) made with PaintShopPro:
    http://www.imaginedb.com/webthumb/capture.png

    The screen capture is a smaller file, possibly because the capture has no dithering? Not sure why.

    Best regards, and thanks for your help, ~Ray

  6. Joshua Eichorn Post author

    Ray:

    I think the difference is just anti-aliasing on font rendering. The image in both screen captures looks identical to me.

  7. Drew

    Hi Josh,

    Awesome service you have here. I have a question. Some pages that I am going to be capturing will be larger than a normal browser window (scrollbars). Is there a command that can currently be sent to either
    1. Automatically detect the current content size and adjust the browser capture size to match? I know something must be in there to do that, since you can pass this information (abnormally wide and/or tall layouts) as part of the width & height parameters.
    2. If the above is not possible, is there any way for me to detect this information and pass it to your API?

  8. Joshua Eichorn Post author

    Drew:
    There currently isn't support for autodetecting the content size, but you can set the browser width/height anywhere from 75×75 to 1280×2048.

    I do have some auto-scrolling code in testing which would allow infinite height, but it has some bugs that are keeping it out of production.

    Feel free to email me at josh@bluga.net if you have any more questions.
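    Since requested dimensions outside that 75×75 to 1280×2048 window presumably get rejected or resized, a caller can normalize sizes up front. A minimal sketch (the limits come from the comment above; the helper name is illustrative, not part of any real webthumb client):

```python
# Clamp a requested capture size into the supported browser dimensions
# (75x75 minimum, 1280x2048 maximum, per the comment above). The function
# name is hypothetical, not part of any real webthumb library.
def clamp_capture_size(width, height):
    """Return (width, height) clamped into the supported range."""
    clamped_w = min(max(width, 75), 1280)
    clamped_h = min(max(height, 75), 2048)
    return clamped_w, clamped_h

print(clamp_capture_size(1920, 5000))  # → (1280, 2048)
print(clamp_capture_size(1280, 1024))  # → (1280, 1024)
```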

  9. Drew

    Josh,

    Gotcha on that one. That is definitely something I will be looking forward to should it ever make it to production! 🙂