Category: amazon-s3

I’ve already added AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION, and AWS_BUCKET with their corresponding values to the Heroku Config Vars. Then I uploaded an image to the '/images' folder on S3: $path = $request->file('image')->store('/images', 's3'); After that, the Heroku server showed the following error: Can anyone help me explain what’s going on? Thanks a lot. I’m trying to figure out… ..
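For store('/images', 's3') to work on Heroku, the s3 disk has to read exactly those env vars. Laravel’s stock config/filesystems.php already does this (shown below for reference; nothing here is custom), so the usual culprits are a typo in a Config Var name or a missing AWS_DEFAULT_REGION:

```php
// config/filesystems.php — Laravel's default 's3' disk definition
's3' => [
    'driver' => 's3',
    'key'    => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
],
```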

Read more

I have a Laravel project with a code block like the one below: $pendingToProcessFiles = PendingToProcessFile::where('tries', '<', 3)->orderBy('created_at')->get(); Log::channel('upload-to-s3-jobs-log')->info("Using queue {$this->queue}"); foreach ($pendingToProcessFiles as $index => $pendingToProcessFile) { $jobObject = new UploadToS3Job($pendingToProcessFile); dispatch($jobObject->onQueue($this->queue)); } This works as expected on my local machine (Laradock), no matter how many records the PendingToProcessFile table has ..
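When a job like this moves to production, loading every pending row with ->get() and dispatching in one loop can exhaust memory or time limits. A minimal sketch of the chunked alternative (the chunkById() call is Eloquent’s standard API; the batch size of 100 is an arbitrary assumption), with the pure chunking logic simulated on plain arrays so it can run standalone:

```php
<?php
// Framework version (sketch — not executed here):
//
//   PendingToProcessFile::where('tries', '<', 3)
//       ->chunkById(100, function ($files) {
//           foreach ($files as $file) {
//               dispatch((new UploadToS3Job($file))->onQueue($this->queue));
//           }
//       });
//
// The chunking itself, demonstrated with plain arrays:
function chunkIds(array $ids, int $size): array
{
    return array_chunk($ids, $size);
}

$batches = chunkIds(range(1, 250), 100);
echo count($batches); // 3 batches: 100 + 100 + 50 records
```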

Read more

I’ve made an image upload that works by sending the base64-encoded image from Vue to Laravel and then uploading it to AWS S3 with Storage::disk('s3')->put('mys3path/'.$request->filename, json_decode($request->file)); Now I want to do the same with video, but I noticed that it’s a very heavy task, since video files are much larger than images. ..
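One concrete reason the base64 approach doesn’t scale to video is the encoding overhead itself, which the snippet below demonstrates; the commented alternative uses Laravel’s standard store() call to stream the upload instead (the form field name 'video' and path 'mys3path' are assumptions):

```php
<?php
// Base64 inflates every payload by about a third — tolerable for images,
// painful for multi-hundred-megabyte videos:
$raw = random_bytes(3000);            // stand-in for file contents
$encoded = base64_encode($raw);
echo strlen($encoded) / strlen($raw); // 4000 / 3000, i.e. a ~33% overhead

// For video, skip base64 entirely and let Laravel stream the uploaded
// file to S3 (sketch; assumes a multipart form field named 'video'):
//
//   $path = $request->file('video')->store('mys3path', 's3');
```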

Read more

I am trying to get my assets stored in S3 by calling an API implemented as a Lambda function with the Bref serverless package. I am using Laravel. A call to Storage::directories(); gives the following error: message: Error executing "ListObjects" on "https://calmed-storage.s3.us-west-2.amazonaws.com/?prefix=&delimiter=%2F&encoding-type=url"; AWS HTTP error: Client error: GET https://mys3-storage.s3.us-west-2.amazonaws.com/?prefix=&delimiter=%2F&encoding-type=url resulted in a 403 Forbidden response: The AWS ..
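A 403 on a ListObjects call usually means the caller can read individual objects but lacks the s3:ListBucket permission, which AWS grants on the bucket ARN itself (arn:aws:s3:::my-bucket), not on arn:aws:s3:::my-bucket/*. A hedged sketch that surfaces this case (assumption: the Lambda’s execution role is the identity that needs the permission):

```php
<?php
use Illuminate\Support\Facades\Storage;

// Storage::directories() issues ListObjects under the hood; a 403 here
// points at the IAM policy, not the Laravel code.
function listRootDirectories(): array
{
    try {
        return Storage::disk('s3')->directories('/');
    } catch (\Aws\S3\Exception\S3Exception $e) {
        // 403 Forbidden -> check the role's policy for s3:ListBucket
        // on the bucket ARN (not just s3:GetObject on bucket/*).
        report($e);
        return [];
    }
}
```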

Read more

Can anyone help me refactor my code to match the second code block? The problem is that I changed the method to use AWS S3 multipart upload, and now I’m not able to put the file in a subfolder of the bucket. All files are uploaded at the root level, and I can’t create a folder or make a ..
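With the PHP SDK’s multipart uploader there is no separate folder parameter: the “folder” is just a prefix in the object key. A sketch of how the key carries the subfolder (the bucket name, file path, and folder are assumptions; only the pure key-building helper is executed here):

```php
<?php
// SDK usage (sketch — assumes an existing Aws\S3\S3Client instance $s3):
//
//   use Aws\S3\MultipartUploader;
//
//   $uploader = new MultipartUploader($s3, '/tmp/video.mp4', [
//       'bucket' => 'my-bucket',
//       'key'    => buildS3Key('videos/2023', 'video.mp4'), // prefix = "folder"
//   ]);
//   $result = $uploader->upload();

// Pure helper that joins a prefix and filename without doubled slashes:
function buildS3Key(string $folder, string $filename): string
{
    return trim($folder, '/') . '/' . ltrim($filename, '/');
}

echo buildS3Key('/videos/2023/', 'video.mp4'); // videos/2023/video.mp4
```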

Read more

I have a working file upload with Vue.js (dashboard) and Laravel (RESTful API). It all works properly in my local dev environment, but when I deploy it to my AWS EC2 instance, it fails with: InvalidArgumentException: The putObject operation requires non-empty parameters: Bucket in file /var/www/api.myapp.com/public_html/myapp-be/vendor/aws/aws-sdk-php/src/InputValidationMiddleware.php I wonder if Storage::disk('s3')->put($file_path.'/'.$file_name, base64_decode($file)); stores the ..
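That error means the bucket name resolved to an empty string on the server, so the put() call itself is not the problem. A pure sketch of the check the SDK is effectively doing, plus the usual server-side causes in comments:

```php
<?php
// The SDK rejects the request before it leaves the box when the resolved
// bucket name is empty — which on EC2 usually means AWS_BUCKET is missing
// from .env, or the config cache was built before it was added
// (fix: `php artisan config:clear` or re-run `config:cache`).
function assertBucketConfigured(?string $bucket): string
{
    if ($bucket === null || $bucket === '') {
        throw new InvalidArgumentException(
            'The putObject operation requires non-empty parameters: Bucket'
        );
    }
    return $bucket;
}

echo assertBucketConfigured('my-bucket'); // my-bucket
```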

Read more

I have 2 apps, one for the dashboard (Vue.js) and one for the API (Laravel)… In my Laravel app, I have an API function that uploads an image to my S3 bucket: class ImageController extends Controller { public function __construct() { $this->middleware('auth:api', ['except' => ['uploadImage']]); $this->middleware('cors'); } public function uploadImage(Request $request) { $disk = Storage::disk('s3'); $disk->put($request->path.'/'.$request->file_name, base64_decode($request->file)); ..

Read more

I have a function in my Laravel app that uploads an image to my S3 bucket: $disk = Storage::disk('s3'); $disk->put($request->path . '/' . $request->file_name, base64_decode($request->file)); The problem is that the newly uploaded image cannot be accessed. I have a folder named public in my S3 bucket, and what I want is that, by default, the public folder ..
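Objects written without an explicit visibility end up private on S3. Laravel’s Storage API accepts the visibility as a third argument to put(), or it can be set once per disk; a hedged sketch of both (the function wrapper is illustrative, not the asker’s code):

```php
<?php
use Illuminate\Support\Facades\Storage;

// Per-call: pass 'public' as the third argument to put().
function uploadPublicImage(string $path, string $name, string $base64): void
{
    Storage::disk('s3')->put($path . '/' . $name, base64_decode($base64), 'public');
}

// Per-disk alternative, in config/filesystems.php under the 's3' disk:
//   'visibility' => 'public',
```

Note that the bucket’s own “Block Public Access” settings can still override this, so those may need adjusting too.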

Read more

One of my API’s routes is associated with a Laravel controller that returns the URL of an image stored on AWS S3. I have a function that looks like: public function getImage($params) { //… $image is fetched from the database return Storage::disk('s3')->response("some_path/".$image->filename); } This code works fine when I’m requesting a few images, but when I try to use it inside some ..
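Storage::response() streams every byte of the image through the PHP process, which gets expensive as soon as many images are requested at once. One common alternative is Laravel’s temporaryUrl(), which returns a signed link so S3 serves the bytes directly; a sketch (the 10-minute expiry is an arbitrary assumption, and the caller would redirect to or embed the returned URL):

```php
<?php
use Illuminate\Support\Facades\Storage;

// Returns a signed S3 URL instead of proxying the image body through PHP.
function getImageUrl(string $filename): string
{
    return Storage::disk('s3')->temporaryUrl(
        'some_path/' . $filename,
        now()->addMinutes(10) // link expires after 10 minutes
    );
}
```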

Read more

I am attempting to upload files using Laravel Livewire. I have created my S3 bucket, and I have also created my user with AWSS3FullAccessRights. By default, public access is turned off on my S3 bucket. When I upload a file, I get a 403 Forbidden error. The request URL is: https://[bucket_name].s3.ap-southeast-2.amazonaws.com/livewire-tmp/HeFkgK7BTeFv2voaPzMPJdwqX69KQ4-metac2FtcGxlLW1wNC1maWxlLm1wNA%3D%3D-.mp4?x-amz-acl=private&X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAS3BSKYVMYVLYCB44%2F20201025%2Fap-southeast-2%2Fs3%2Faws4_request&X-Amz-Date=20201025T041359Z&X-Amz-SignedHeaders=host%3Bx-amz-acl&X-Amz-Expires=300&X-Amz-Signature=9e87a654cbeab84772c18a950741027e7eeea1ad41b28e7b1c84a60cf2266147. The file is not getting ..
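Livewire signs a direct browser-to-S3 upload for the livewire-tmp/ prefix, so a 403 at this stage usually comes from the bucket side rather than the Laravel code: the IAM user needs s3:PutObject on that prefix, and the bucket’s CORS rules must allow the browser’s PUT from the app’s origin. For reference, a sketch of the relevant Livewire config (key names follow Livewire’s stock config file; the disk name 's3' is an assumption):

```php
// config/livewire.php
'temporary_file_upload' => [
    'disk' => 's3',       // must match a disk in config/filesystems.php
    'directory' => null,  // null => uploads land under livewire-tmp/
],
```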

Read more

I implemented an image resize method for all existing product images, but it gives an error message like this: production.ERROR: Allowed memory size of 134217728 bytes exhausted (tried to allocate 86736576 bytes) {"exception":"[object] (Symfony\Component\Debug\Exception\FatalErrorException(code: 1): Allowed memory size of 134217728 bytes exhausted (tried to allocate 86736576 bytes) at /home/httpd/htdocs/qa/ebeyonds_oos/api/vendor/intervention/image/src/Intervention/Image/AbstractDecoder.php:236) The existing saved image is read from an AWS S3 bucket ..
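The failure point (AbstractDecoder) hints at why: Intervention decodes the whole image into an uncompressed bitmap, which needs far more memory than the file’s size on disk. A back-of-envelope sketch (4 bytes per pixel is a common approximation for an RGBA bitmap; the 6000×4000 dimensions are hypothetical):

```php
<?php
// Rough estimate of the decoded (in-memory) size of an image.
function estimateDecodedBytes(int $width, int $height, int $bytesPerPixel = 4): int
{
    return $width * $height * $bytesPerPixel;
}

// A hypothetical 6000x4000 product photo needs ~96 MB decoded — close to
// the 128 MB (134217728-byte) limit before the resize even allocates its
// output buffer:
echo estimateDecodedBytes(6000, 4000); // 96000000

// Mitigations: raise memory_limit for the CLI/queue worker running the
// batch, process one image per job, or free each image between iterations
// (Intervention's $img->destroy()).
```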

Read more

I’m trying to send an image stored in Amazon S3 to Stripe, but I ran into a problem while creating the file. This is the code I use to create the file on Stripe: use Stripe\File; … $fp = fopen($filePath, 'r'); return File::create([ 'file' => $fp, 'purpose' => 'identity_document' ], $this->stripeOptions()); The error was thrown when I tried to create ..
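A plain fopen() only works on local paths (or URLs/registered stream wrappers), so if $filePath is the object’s key on the S3 disk it will fail. Laravel’s readStream() returns a resource that Stripe’s File::create can consume; a hedged sketch under that assumption:

```php
<?php
use Illuminate\Support\Facades\Storage;
use Stripe\File;

// Sketch — assumes $filePath is the object's key on the 's3' disk, not a
// local filesystem path. readStream() yields a resource Stripe can read.
function createStripeFileFromS3(string $filePath, array $stripeOptions): File
{
    $fp = Storage::disk('s3')->readStream($filePath);

    return File::create([
        'file'    => $fp,
        'purpose' => 'identity_document',
    ], $stripeOptions);
}
```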

Read more