AWS S3 copy and replicate folder in Laravel
I am trying to copy a folder which already exists on S3 and save it under a different name on S3, in Laravel 5.4. What I have found so far is that I can copy an image, but not a folder. I have tried to copy the folder like this:

 $disk->copy("admin/form/$old_form", "admin/form/$new_form");


But it does not work like that; it gives me an error. Do I need to loop through and copy each item in the folder separately? For example:

$images = $disk->allFiles("admin/form/$id");


Or is there any workaround available in Laravel or the S3 API itself?

Please help, it's driving me crazy.

Thanks in advance.
php laravel amazon-web-services laravel-5 amazon-s3






asked Jul 4 '17 at 8:18









Faran Khan

  • Seriously no one?

    – Faran Khan
    Jul 5 '17 at 6:01











  • First, S3 doesn't have the concept of folders, so copying/renaming a folder won't work (github.com/thephpleague/flysystem-aws-s3-v3/issues/128); oddly enough, delete does work. Laravel doesn't allow copying a folder either, but you can still use laravel-recipes.com/recipes/148/…, which needs the full path for both the old and new locations to make the copy, or you'll get an error (see the sketch below). I've tried it successfully with the local driver but haven't tested it with S3 yet.

    – ctf0
    Dec 20 '17 at 1:50
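

For the local driver, the recipe that comment points at boils down to File::copyDirectory(), which operates on absolute filesystem paths rather than on Storage disks. A minimal sketch, with placeholder paths:

use Illuminate\Support\Facades\File;

// copyDirectory() works on absolute local paths, not through the Storage disks,
// which is why it helps for the local driver but not directly for S3.
// The paths below are placeholders.
$old = storage_path('app/admin/form/old-form');
$new = storage_path('app/admin/form/new-form');

File::copyDirectory($old, $new);
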
3 Answers
I'm in the middle of doing this same thing. Based on what I've read so far, copying a directory itself using Laravel doesn't seem to be possible. The suggestions I've seen so far involve looping through and copying each file, but I'm not at all satisfied with the speed (since I'm doing this on lots of images several times a day).



Note that I'm only using the FilesystemManager directly like this so I can more easily access the methods in PhpStorm; $s3 = Storage::disk('s3'); would accomplish the same thing as the two FilesystemManager lines below. I'll update this answer if I find anything that works more quickly.



use Illuminate\Filesystem\FilesystemManager;

$filesystem = new FilesystemManager(app());
$s3 = $filesystem->disk('s3');
$images = $s3->allFiles('old-folder');

// If a target file already exists, copy() will throw an exception,
// so delete the entire destination "folder" first to simplify things.
$s3->deleteDirectory('new-folder');

foreach ($images as $image) {
    $new_loc = str_replace('old-folder', 'new-folder', $image);
    $s3->copy($image, $new_loc);
}





answered Jul 8 '17 at 1:16
MMMTroy
























  • Yes. This is how I achieved it as well, but it takes a lot of time. I wish there were another way to copy the folder directly.

    – Faran Khan
    Jul 11 '17 at 6:03
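

If the per-file loop above proves too slow, one option is to issue the copies concurrently through the underlying AWS SDK for PHP rather than one Storage copy() call at a time. A rough sketch, assuming the default flysystem-aws-s3-v3 adapter; the prefixes and the concurrency value are placeholders:

use Aws\CommandPool;
use Illuminate\Support\Facades\Storage;

// Reuse the S3 client and bucket that Laravel's "s3" disk is already configured with.
$adapter = Storage::disk('s3')->getDriver()->getAdapter();
$client  = $adapter->getClient();
$bucket  = $adapter->getBucket();

// Placeholder prefixes.
$oldPrefix = 'admin/form/old-form/';
$newPrefix = 'admin/form/new-form/';

// Build one CopyObject command per key under the old prefix.
$commands = [];
foreach ($client->getPaginator('ListObjectsV2', ['Bucket' => $bucket, 'Prefix' => $oldPrefix]) as $page) {
    foreach ($page['Contents'] ?: [] as $object) {
        $commands[] = $client->getCommand('CopyObject', [
            'Bucket'     => $bucket,
            'Key'        => $newPrefix . substr($object['Key'], strlen($oldPrefix)),
            'CopySource' => $bucket . '/' . $object['Key'],
        ]);
    }
}

// Execute the copies 25 at a time instead of sequentially.
(new CommandPool($client, $commands, ['concurrency' => 25]))->promise()->wait();
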
Another option:



use Illuminate\Support\Facades\Storage;

$files = Storage::disk('s3')->allFiles('old/location');
foreach ($files as $file) {
    // Copy each file under the new prefix unless it is already there.
    $copied_file = str_replace('old/location', 'new/location', $file);
    if (! Storage::disk('s3')->exists($copied_file)) {
        Storage::disk('s3')->copy($file, $copied_file);
    }
}





answered Apr 26 '18 at 14:06
spedley
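

If this needs to run regularly, the same loop can be wrapped in an Artisan command. A sketch only; the class name and the aws:copy-folder signature are made up for illustration, and the command still has to be registered in app/Console/Kernel.php:

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Storage;

class CopyS3Folder extends Command
{
    protected $signature = 'aws:copy-folder {from} {to}';

    protected $description = 'Copy every file under one S3 prefix to another';

    public function handle()
    {
        $disk = Storage::disk('s3');

        foreach ($disk->allFiles($this->argument('from')) as $file) {
            $target = str_replace($this->argument('from'), $this->argument('to'), $file);

            if (! $disk->exists($target)) {
                $disk->copy($file, $target);
            }
        }
    }
}
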

I've found that a faster way to do this is to use the AWS command line tools, specifically the aws s3 sync command.



Once installed on your system, you can invoke it from within your Laravel project using shell_exec. For example:



$source = 's3://abc';
$destination = 's3://xyz';
// escapeshellarg() guards the arguments in case the paths ever come from user input.
shell_exec('aws s3 sync ' . escapeshellarg($source) . ' ' . escapeshellarg($destination));


If you set your AWS credentials (AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY) in your .env file, the aws command will pick these values up when invoked from within Laravel.






answered Nov 14 '18 at 1:42
Susan
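

shell_exec() discards the command's exit status, so if you need to know whether the sync actually succeeded, plain exec() is a small step up. A minimal sketch; the bucket paths are placeholders:

$source = escapeshellarg('s3://source-bucket/admin/form/old-form');
$destination = escapeshellarg('s3://destination-bucket/admin/form/new-form');

// Capture both the output and the exit status of the aws CLI call.
exec("aws s3 sync {$source} {$destination} 2>&1", $output, $status);

if ($status !== 0) {
    // Surface the CLI output so a failed copy isn't silent.
    throw new \RuntimeException('aws s3 sync failed: ' . implode("\n", $output));
}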