Enabling a Lambda Function Trigger Using AWS CodePipeline

Why enable a Lambda function trigger using a pipeline?

For an AWS service to invoke your function directly, you create a trigger, for example from the Lambda console. A trigger is a resource you configure to allow another AWS service to invoke your function when certain events or conditions occur. Your function can have multiple triggers; each trigger acts as a client invoking your function independently, and each event that Lambda passes to your function has data from only one trigger. By using CodePipeline, we enable our Lambda function trigger whenever we need to.

What is the benefit?

Nobody has to attach permissions to the Lambda function roles manually, and nobody has to enable the trigger manually: once the policy is attached to the relevant roles, the pipeline enables the trigger for us. The whole flow is automated and can be repeated whenever we need it.

My Personal Experience

Whenever I got the task of enabling a trigger on a Lambda function, I did it manually every time. On my client's side everything runs on serverless computing, so there are a lot of Lambda functions, and doing this by hand each time was tedious. I discussed the issue with my manager, and after referring to some documentation I found online, I arrived at a solution and created a pipeline for it.

First, we need to create a CodePipeline and add the required stages. In my pipeline for enabling the Lambda function trigger, I added six stages.


The first stage
1] We need to create an S3 bucket and upload some objects to it.

2] In my scenario, I used an S3 bucket as the source and mentioned the bucket name and the object name in the first stage of the pipeline.

3] Upload the env file as an S3 object. We have different types of environments, so the pipeline works according to the env file.

4] Once we upload an env file as an object to the S3 bucket, the pipeline is triggered automatically.

5] For this automatic trigger, I named the output artifact “latestEnv”. A sample source action configuration is shown below.
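
As a reference, a source action along these lines could be declared in a CloudFormation pipeline definition. This is only a minimal sketch: the bucket name, object key, and region are placeholders rather than the values from my setup, and the bucket must have versioning enabled for an S3 source action to work.

# Sketch of an S3 source action inside an AWS::CodePipeline::Pipeline stage.
# Bucket name and object key are placeholders.
- Name: Source-Env-File
  ActionTypeId:
    Category: Source
    Owner: AWS
    Provider: S3
    Version: "1"
  Configuration:
    S3Bucket: my-lambda-trigger-bucket     # bucket created in step 1
    S3ObjectKey: latest.env                # env file uploaded in step 3
    PollForSourceChanges: "true"           # start the pipeline when a new version is uploaded
  OutputArtifacts:
    - Name: latestEnv                      # artifact name referenced by later stages
  RunOrder: 1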

The second stage

1] We already created the S3 bucket in the first stage; we use the same bucket to run the CloudFormation template.

2] In the second stage we upload one more object as a source, in .zip format. The .zip file contains the CloudFormation template source code that grants permissions to the Lambda function roles.

3] In my scenario, I named the output artifact “templates.zip”. A sample action configuration for this source is shown below.
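
A second source action for the template archive could look similar. Note that, as far as I know, CodePipeline artifact names only allow letters, numbers, hyphens, and underscores, so in the sketch below the artifact is named “templates” (which also matches the input artifact used in the fifth stage) while the .zip appears only in the S3 object key; bucket and key are again placeholders.

# Sketch of the second S3 source action; bucket name and object key are placeholders.
- Name: Source-CFN-Templates
  ActionTypeId:
    Category: Source
    Owner: AWS
    Provider: S3
    Version: "1"
  Configuration:
    S3Bucket: my-lambda-trigger-bucket     # same bucket as in the first stage
    S3ObjectKey: templates.zip             # zip containing kinesis-stream.yaml
  OutputArtifacts:
    - Name: templates                      # consumed by the CloudFormation deploy action
  RunOrder: 1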

The third stage

1] In the third stage we export the environment for the Lambda function, because we have different types of Lambda functions.

2] Mention the required fields such as the Lambda region, the input artifact “latestEnv”, and the function name. A sketch of this action is shown below.
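
The post does not spell out which action provider performs this export, so the sketch below assumes a Lambda invoke action with the namespace “export”; with that namespace, later stages can reference variables such as #{export.envName}. The helper function name is hypothetical, and for #{export.envName} to resolve, the helper would have to return envName as an output variable when it reports success back to CodePipeline.

# Sketch of the "export environment" action (assumed to be a Lambda invoke action).
- Name: Export-Environment
  ActionTypeId:
    Category: Invoke
    Owner: AWS
    Provider: Lambda
    Version: "1"
  Namespace: export                        # lets later stages use #{export.envName}
  Region: ap-south-1                       # Lambda region (Mumbai, as an example)
  Configuration:
    FunctionName: export-env-helper        # hypothetical helper function
  InputArtifacts:
    - Name: latestEnv                      # env file from the first stage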

The fourth stage

1] In the fourth stage we add some tags to the Lambda function, because in some cases the function only works for the user after the tags are added.

2] Mention the required fields such as the Lambda region, the input artifact “latestEnv”, and the function name. See the sketch after this list.
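
Again, the post does not name the provider used for tagging. One possible shape is another Lambda invoke action whose UserParameters carry the tags to apply; the helper function name and the tag keys below are purely hypothetical.

# Sketch of a tagging action (assumed Lambda invoke; helper name and tags are hypothetical).
- Name: Tag-Lambda-Function
  ActionTypeId:
    Category: Invoke
    Owner: AWS
    Provider: Lambda
    Version: "1"
  Region: ap-south-1
  Configuration:
    FunctionName: tag-lambda-helper        # hypothetical helper that calls lambda:TagResource
    UserParameters: '{"functionName": "lambda-#{export.envName}-mumbai", "tags": {"env": "#{export.envName}"}}'
  InputArtifacts:
    - Name: latestEnv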

The fifth stage

1] In the fifth stage we deploy the CloudFormation source code to create a stack that assigns the policy to the particular Lambda roles.

2] For that we mention the action provider as “AWS CloudFormation”, the region for stack creation, and the input artifact containing the templates.

3] For the action mode we choose “Create or update stack”, because every time the pipeline runs it will either create a new stack or update the existing one according to the CloudFormation template code.

4] We mention the stack name so that a stack is created per environment, for example “kinesis-#{export.envName}”.

5] We mention the template source code file name used to create or update the stack, for example “kinesis-stream.yaml”.

6] I am using the YAML source code below to run the stack. A sample deploy action configuration follows the template.

AWSTemplateFormatVersion: "2010-09-09"
Description: "This template gives step function access to dsl-platform users."
Parameters:
  EnvName:
    Type: String
Resources:
  SCMIMOSFNP:
    Type: "AWS::IAM::Policy"
    Properties:
      PolicyName: "kinesis-policy"
      PolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: "Allow"
            Action:
              - "kinesis:*"
            Resource:
              - "*"
      Roles:
        - !Sub "rolename-${EnvName}.mumbai"
        - !Sub "rolename-${EnvName}.mumbai"
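
Putting steps 2 to 5 together, the deploy action for this stage could be configured roughly as follows. The region, role ARN, and account ID are placeholders; CAPABILITY_NAMED_IAM is included because the template creates IAM resources, and ParameterOverrides passes the EnvName parameter the template expects.

# Sketch of the CloudFormation deploy action described in steps 2-5 above.
- Name: Deploy-Kinesis-Policy
  ActionTypeId:
    Category: Deploy
    Owner: AWS
    Provider: CloudFormation
    Version: "1"
  Region: ap-south-1                                  # region where the stack is created
  Configuration:
    ActionMode: CREATE_UPDATE                         # "Create or update stack"
    StackName: "kinesis-#{export.envName}"            # one stack per environment
    TemplatePath: "templates::kinesis-stream.yaml"    # file inside the templates artifact
    ParameterOverrides: '{"EnvName": "#{export.envName}"}'
    Capabilities: CAPABILITY_NAMED_IAM                # template creates IAM resources
    RoleArn: arn:aws:iam::111111111111:role/cfn-deploy-role   # placeholder deploy role
  InputArtifacts:
    - Name: templates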

The sixth stage

1] In this stage we create a Step Function (state machine) that enables the Lambda function's trigger by enabling its event source mappings.

2] I used the source code below for the state machine that enables the Lambda function.

{
    "StartAt": "UpdateKinesisEventSourceMappingOne",
    "States": {
      "UpdateKinesisEventSourceMappingOne": {
        "Type": "Task",
        "Resource": "arn:aws:states:::aws-sdk:lambda:updateEventSourceMapping",
        "Parameters": {
          "FunctionName": "lambda-envname-mumbai",
          "Uuid": "6d29bbfa-8221-47a3-9b01-f55a7d5894ee3",
          "Enabled": true
        },
        "Next": "UpdateKinesisEventSourceMappingTwo"
      },
      "UpdateKinesisEventSourceMappingTwo": {
        "Type": "Task",
        "Resource": "arn:aws:states:::aws-sdk:lambda:updateEventSourceMapping",
        "Parameters": {
          "FunctionName": "lambda-envname-mumbai",
          "Uuid": "38059928-9083-4c66-87cf-de3f1714567bf",
          "Enabled": true
        },
        "End": true
      }
    }
  }

3] We mention the action name, for example “Enable-Kinesis-Stream”, the action provider as AWS Step Functions, the region of the Step Function, and the input artifact “latestEnv”.

4] We mention the ARN of the state machine we already created. A sample action configuration is shown below.
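
The Step Functions action from steps 3 and 4 could be configured roughly like this; the account ID and state machine name in the ARN are placeholders for the state machine created earlier.

# Sketch of the Step Functions invoke action from steps 3 and 4 above.
- Name: Enable-Kinesis-Stream
  ActionTypeId:
    Category: Invoke
    Owner: AWS
    Provider: StepFunctions
    Version: "1"
  Region: ap-south-1
  Configuration:
    StateMachineArn: arn:aws:states:ap-south-1:111111111111:stateMachine:enable-lambda-trigger   # placeholder ARN
    ExecutionNamePrefix: enable-trigger
  InputArtifacts:
    - Name: latestEnv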

Conclusion

In conclusion, it is easy to enable a Lambda function trigger by using CodePipeline: every time, we just upload an env file to the bucket and the pipeline triggers automatically. We no longer do this manually, because the automation saves time, and we use the steps above for each and every Lambda function we enable.

References

https://repost.aws/knowledge-center/cloudformation-attach-managed-policy
https://docs.aws.amazon.com/step-functions/latest/dg/connect-lambda.html

Blog Pundits: Bhupender Rawat and Sandeep Rawat

OpsTree is an End-to-End DevOps Solution Provider.
