TechHui

Hawaiʻi's Technology Community

I'm working on a PHP4 class that uses the REST protocol for Amazon S3 without requiring CURL or PEAR HTTP_Request. I haven't been able to find an existing PHP4 class that worked for me, and I didn't want to depend on PEAR or CURL.

So far, I've got putting objects working. My next task is copying objects. I've been trying for a couple of days and can't figure out what I'm doing wrong, so I'd really appreciate someone's help getting the class to copy. My problem with copy is that the script times out and never even gets a response from Amazon.

In my web app, I have a photo album, and I want to allow users to download original-sized versions of photos. To prevent hotlinking, I want to copy each file to a version with a private ACL and a Content-Disposition header that prompts a download, then generate a signed GET URL that points at that download version of the image.
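
The end product I'm after is a time-limited link to that private download copy. Just so the goal is concrete, here's a rough sketch of the query-string authentication described in the S3 docs. The bucket/key names are placeholders, and hmac_sha1() stands in for whatever HMAC-SHA1 helper I end up using, since PHP4 doesn't have hash_hmac():

    // Sketch only - signed GET URL for the private download copy.
    $expires   = time() + 300;                      // link good for 5 minutes
    $resource  = '/mybucket/downloads/photo.jpg';   // placeholder bucket/key
    $to_sign   = "GET\n\n\n" . $expires . "\n" . $resource;
    $signature = urlencode(base64_encode(hmac_sha1($secret_key, $to_sign)));  // hypothetical helper
    $url = 'http://s3.amazonaws.com' . $resource
         . '?AWSAccessKeyId=' . $access_key
         . '&Expires=' . $expires
         . '&Signature=' . $signature;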

Replies to This Discussion

Just curious, but why would you not want to use cURL or PEAR? Why wouldn't you want to leverage existing classes that already do the trick?

I'm getting up to speed with PHP myself and am wondering about your cURL/PEAR comment.

Hi Truman,

Sent you a link via Twitter, but here are the S3 classes that I know of. These are just based on my bookmarks, and none were written for PHP4, but they might help as a reference.

1) Storage3
2) Standalone S3 client using CURL
3) Pear S3 Package
4) IBM Developer's Works

Truman,

I haven't run your code yet, but at first glance, I noticed this line in your sources for copy() at Line 336:

$request_uri = '/' . $target;

I don't see where $target gets defined. I do see $target_bucket and $target_object coming in as parameters. This may be where your problem lies.

You might also need to set "Content-Length" to zero in your headers so Amazon knows you're sending a zero-length PUT request.

Let me know if either turns out to be the root cause of your woes. :-)
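
In other words, something along these lines - I'm guessing at your variable and header handling from the snippet, so treat it as a sketch rather than a drop-in fix:

    // Build the target URI from the parameters copy() actually receives
    $request_uri = '/' . $target_bucket . '/' . $target_object;

    // Tell Amazon this PUT carries no body
    $headers[] = 'Content-Length: 0';
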
Thanks, guys, for taking the time to look into this and reply.

Peter - I didn't want the extra overhead of PEAR.

Scott - thanks for taking the time to send me those links. I'd seen all of them except the first one. Only #2 supports the newish copy feature that Amazon unveiled late last year. I looked at Donovan Schönknecht's standalone S3 client class, and my copy function seems to accommodate all the required settings and more than his does.

Laurence - thanks for catching that bug. I'd forgotten to change that line when I added support for copying an object from one bucket to another. I fixed it in the code, and I put the Content-Length header parameter back in, which I had in there before. Unfortunately, it still doesn't work. I'm stumped.

Looking further into the code: since you're not passing file_data to your call to http_query(), you aren't appending the two newlines after your PUT request, which is probably the show-stopper, as Amazon is waiting for you to terminate your PUT request.

Around Line 183:
                if ($file_data) {
                        $query .= "\n\n" . $file_data;
                }
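
One way around it might be to always terminate the header block and only append a body when there is one - untested against your class, but roughly:

                $query .= "\n\n";       // blank line ends the headers, even for a body-less PUT
                if ($file_data) {
                        $query .= $file_data;
                }
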
Laurence, you are da bomb! The two newlines finally let me get some response out of Amazon. I didn't realize that two newlines terminate a PUT request. Is that a REST convention?

Now I've got a problem with the formatting of my "string to sign", but I'm sure I can work that out. (The layout I'm working from is below.)

I owe you lunch at a nice place!
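
For anyone else who hits this, here's the string-to-sign layout I'm using for the copy PUT, straight from my reading of the S3 REST authentication docs (bucket/key names are placeholders again):

    $date = gmdate('D, d M Y H:i:s T');
    $string_to_sign = "PUT\n"                                // HTTP verb
        . "\n"                                               // Content-MD5 (empty - no body)
        . "\n"                                               // Content-Type (empty)
        . $date . "\n"                                       // Date header
        . "x-amz-acl:private\n"                              // x-amz-* headers, lowercased
        . "x-amz-copy-source:/mybucket/photos/photo.jpg\n"   //   and sorted alphabetically
        . "x-amz-metadata-directive:REPLACE\n"
        . "/mybucket/downloads/photo.jpg";                   // canonicalized resource (target)
    // The Authorization header is then "AWS " . $access_key . ":" followed by the
    // base64-encoded HMAC-SHA1 of $string_to_sign with the secret key.
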
Glad to help out a fellow coder - sometimes all it takes is a fresh pair of eyes. :-)

I'll happily accept a good cup of coffee sometime, or a professional recommendation in case market conditions force me to relocate to Seattle or the Bay.

About the two newlines: I think you need at least one blank line after the headers to satisfy the HTTP requirements, per RFC 2616 section 5. It's a rather dull read, though.
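
If it helps to picture it, the copy request should come out looking roughly like this on the wire (placeholders in angle brackets; the RFC technically wants \r\n line endings, though your \n\n clearly got Amazon to answer):

    PUT /mybucket/downloads/photo.jpg HTTP/1.0
    Host: s3.amazonaws.com
    Date: <RFC 1123 date>
    Authorization: AWS <access key id>:<signature>
    x-amz-copy-source: /mybucket/photos/photo.jpg
    x-amz-acl: private
    x-amz-metadata-directive: REPLACE
    Content-Disposition: attachment; filename="photo.jpg"
    Content-Length: 0
    <blank line - this is the terminator, then the empty body>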
