How to deploy multiple Lambdas in a CloudFormation template through CodePipeline?



























Use case



I have a CloudFormation stack with more than 15 Lambda functions in it. I can deploy the stack through CodePipeline, which consists of two stages: CodeCommit and CodeDeploy. In this approach all my Lambda code is inline in the CloudFormation template. For security reasons I want to change this inline code to S3, which in turn requires an S3 bucket name and S3 key.



As a temporary workaround



As of now I am zipping each Lambda file and manually passing the S3 key name and bucket name as parameters to my stack.
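For context, the manual step looks roughly like this (the bucket, key, file, and stack names below are placeholders, not my real ones):

```shell
# Zip one function and upload it (repeated for each of the 15 functions)
zip function-one.zip function_one.py
aws s3 cp function-one.zip s3://my-deployment-bucket/lambda/function-one.zip

# Deploy the stack, passing the bucket and key as parameters
aws cloudformation deploy \
  --template-file template.yml \
  --stack-name my-lambda-stack \
  --parameter-overrides \
      S3BucketName=my-deployment-bucket \
      S3Key=lambda/function-one.zip
```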




Is there any way to do this step via CodePipeline?




My assumption on CodeBuild



I know we can use CodeBuild for this. But so far I have only seen CodeBuild used to build from a package.json file, and in my use case I don't have one. I can also see it is possible to run the `aws cloudformation package` command to upload my Lambda code from local to S3; this command will generate an S3 CodeUri. But that is for serverless applications, where there is a single Lambda, whereas in my case I have 15.
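For what it's worth, a minimal CodeBuild buildspec running the package command might look like this (the bucket and template names are placeholders). Note that, as far as I can tell, `aws cloudformation package` rewrites the local CodeUri of every function resource in the template, not just one:

```yaml
version: 0.2
phases:
  build:
    commands:
      # Uploads the local CodeUri of each function in template.yml to S3
      # and emits a copy of the template with S3 locations substituted in
      - aws cloudformation package
          --template-file template.yml
          --s3-bucket my-artifact-bucket
          --output-template-file template-export.yml
artifacts:
  files:
    - template-export.yml
```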



What I have tried



I know that as soon as you do a git push to CodeCommit, it keeps your code in S3. So my idea was to get the S3 bucket name and S3 key name of the pushed file from CodeCommit and pass them as parameters to my CFN template. I can get the S3 bucket name, but I don't know how to get the S3 key name. I also don't know whether this approach is workable at all.



BTW, I know I could use a shell script to automate this process, but is there a way to do it via CodePipeline?



Update: tried the serverless approach



Basically I run two build actions with two different runtimes (Node.js and Python), which run independently. With the serverless approach each build creates a template-export.yml file whose CodeUri points at the bucket location, which means I end up with two template-export.yml files. One problem with the serverless approach is that it has to create a change set and then execute it. Because of that I need to merge those two template-export.yml files before the create-change-set action followed by execute-change-set. But I don't know whether there is a command to merge two SAM templates; otherwise one template-export.yml stack will replace the other.
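One possible way to sidestep the merge, assuming both runtimes can live in one template: declare every function in a single SAM template, each with its own local CodeUri, and run the package step once so only one template-export.yml is produced. A sketch (function names and paths are placeholders):

```yaml
Transform: AWS::Serverless-2016-10-31
Resources:
  NodeFunction:
    Type: AWS::Serverless::Function
    Properties:
      Runtime: nodejs8.10
      Handler: index.handler
      CodeUri: ./src/node-function/    # packaged into its own S3 key
  PythonFunction:
    Type: AWS::Serverless::Function
    Properties:
      Runtime: python3.6
      Handler: app.handler
      CodeUri: ./src/python-function/  # packaged into a separate S3 key
```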



Any help is appreciated
Thanks









































      amazon-web-services aws-code-deploy aws-codepipeline aws-codebuild aws-codecommit






      edited Dec 8 '18 at 5:35







      Private

















      asked Nov 17 '18 at 4:33









      Private





















          2 Answers






































          If I'm understanding you right, you just need an S3 bucket and key to be piped into your Lambda CloudFormation template. To do this I'm using the ParameterOverrides declaration in my pipeline.



          Essentially, the pipeline is a separate stack and picks up a CF template located in the root of my source. It then overrides two parameters in that template that point it to the appropriate S3 bucket/key.



              - Name: LambdaDeploy
                Actions:
                  - Name: CreateUpdateLambda
                    ActionTypeId:
                      Category: Deploy
                      Owner: AWS
                      Provider: CloudFormation
                      Version: 1
                    Configuration:
                      ActionMode: CREATE_UPDATE
                      Capabilities: CAPABILITY_IAM
                      RoleArn: !GetAtt CloudFormationRole.Arn
                      StackName: !Join
                        - ''
                        - - Fn::ImportValue: !Sub '${CoreStack}ProjectName'
                          - !Sub '${ModuleName}-app'
                      TemplatePath: SourceOut::cfn-lambda.yml
                      ParameterOverrides: '{ "DeploymentBucketName" : { "Fn::GetArtifactAtt" : ["BuildOut", "BucketName"]}, "DeploymentPackageKey": {"Fn::GetArtifactAtt": ["BuildOut", "ObjectKey"]}}'


          Now, the fact that you have fifteen Lambda functions might throw a wrench in it. For that I don't exactly have an answer, since I'm actually trying to do the same thing and package up multiple Lambdas in this kind of way.
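For reference, the cfn-lambda.yml side of this might consume those two overridden parameters like so (a sketch; the function name, handler, and the `LambdaRole` resource are placeholders I've assumed, not part of the pipeline above):

```yaml
Parameters:
  DeploymentBucketName:
    Type: String
  DeploymentPackageKey:
    Type: String

Resources:
  MyFunction:
    Type: AWS::Lambda::Function
    Properties:
      Runtime: python3.6
      Handler: app.handler
      Role: !GetAtt LambdaRole.Arn   # assumes an IAM role defined elsewhere
      Code:
        S3Bucket: !Ref DeploymentBucketName
        S3Key: !Ref DeploymentPackageKey
```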






























          • Great, @Guestie. I was digging into the same approach. Unfortunately this won't help me, because each of my Lambdas requires external dependencies, and they need to be kept in the same bucket but with different object keys. I can see CodeBuild has an option for multiple output artifacts; instead, I copied my Lambda functions to S3 through build commands (i.e. `aws s3 cp ...`). Thanks for your time.

            – Private
            Dec 8 '18 at 12:43













          • BTW, you need to include the `InputArtifacts` key in the `Deploy` stage with the values SourceOut and BuildOut, since you are using these in the Deploy stage. Only then will your deploy stack run.

            – Private
            Dec 8 '18 at 12:49

































          There's documentation on deploying multiple Lambda functions via CodePipeline and CloudFormation here: https://docs.aws.amazon.com/lambda/latest/dg/build-pipeline.html



          I believe this will still upload the function code to S3, but it will leverage AWS tooling to make this process simpler.






























          • Thanks @TimB for your time. Please check the Update section of my question.

            – Private
            Dec 8 '18 at 5:36











