serverless-heaven / serverless-aws-alias

Alias support for Serverless 1.x

Cannot redeploy when using DynamoDB Streams

arabold opened this issue

In my project I'm using DynamoDB streams to trigger a Lambda. However, the Alias plugin seems to prevent a regular redeploy of the project; only the first deploy worked as expected.

This is the error thrown by CloudFormation during deployment (Checking Stack update progress):

An error occurred: my-project-dev - Export my-project-dev-MyTableStreamArn cannot be updated as it is in use by my-project-dev-dev.

The relevant parts of the serverless.yaml are:

# [...]

functions:
  myLambdaFunc:
    handler: handlers/myLambdaFunc.handler
    timeout: 300
    events:
      # Run once a day (midnight UTC)
      - schedule: cron(0 0 * * ? *)
      # Also trigger if anything relevant in the database changes
      - stream:
          type: dynamodb
          arn:
            Fn::GetAtt:
              - MyTable
              - StreamArn

resources:
  # AWS CloudFormation Template
  Resources:
    MyTable:
      Type: AWS::DynamoDB::Table
      DeletionPolicy: Retain
      Properties:
        StreamSpecification:
          StreamViewType: KEYS_ONLY
        AttributeDefinitions:
          - # [...]
        KeySchema:
          - # [...]
        ProvisionedThroughput:
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1
        GlobalSecondaryIndexes:
          - # [...]

I will have to spend more time figuring out what exactly causes this problem, as none of the resources are actually supposed to change during the deploy. But as of right now, it seems I cannot properly use DynamoDB streams as Lambda triggers in combination with the Alias plugin.

Hi @arabold,

I think support for DynamoDB streams is still missing in the alias plugin. The reason the error occurs is that the target of the trigger has to be changed to the current alias of the deployment, i.e. the trigger itself has to be moved to the alias stack so that it is attached to the alias / versioned function.
Technically the plugin has to be changed so that DynamoDB triggers are handled exactly like the Kinesis triggers (see stackOps/events). As soon as the trigger is moved to the alias stack, its target has to be set to the logical id of the alias resource - which in turn will automatically be converted to the aliased function version.
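
For illustration, here is a minimal sketch of what such a moved trigger could look like in the alias stack template. The logical ids used here (MyLambdaFuncEventSourceMappingDynamodbMyTable, MyLambdaFuncAlias) are made up for the example and are not necessarily the plugin's actual naming scheme:

Resources:
  MyLambdaFuncEventSourceMappingDynamodbMyTable:
    Type: AWS::Lambda::EventSourceMapping
    Properties:
      # Stream ARN handed over from the main stack
      EventSourceArn:
        Fn::ImportValue: my-project-dev-MyTableStreamArn
      # Target the alias resource instead of the plain function, so the
      # mapping always points at the aliased (versioned) function
      FunctionName:
        Ref: MyLambdaFuncAlias
      StartingPosition: TRIM_HORIZON
      BatchSize: 10
      Enabled: true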

To begin analyzing the issue, you can run serverless package and inspect the generated main and alias templates in .serverless. Then you can see what actually has to be moved by the plugin.

I have now checked the implementation in detail and read the DynamoDB stream documentation.

The reason for this issue is that on a subsequent deploy the DynamoDB stream ARN might (and will) change, so the value of the generated stream output variable changes too.
But that fails, because the alias stack currently references the output (the ARN) via Fn::ImportValue, and CloudFormation does not allow changing the value of an export while another stack imports it.
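
To illustrate the constellation (the export name is taken from the error above, the other logical ids are again made up): the main stack exports the stream ARN, and the alias stack consumes it via Fn::ImportValue.

# Main stack (sketch) - exports the stream ARN
Outputs:
  MyTableStreamArn:
    Value:
      Fn::GetAtt:
        - MyTable
        - StreamArn
    Export:
      Name: my-project-dev-MyTableStreamArn

# Alias stack (sketch) - imports the exported value for the trigger
Resources:
  MyLambdaFuncEventSourceMappingDynamodbMyTable:
    Type: AWS::Lambda::EventSourceMapping
    Properties:
      EventSourceArn:
        Fn::ImportValue: my-project-dev-MyTableStreamArn
      FunctionName:
        Ref: MyLambdaFuncAlias
      StartingPosition: TRIM_HORIZON

As soon as the stream ARN behind the export changes, CloudFormation refuses to update the export because my-project-dev-dev still imports it - which is exactly the error quoted at the top of this issue.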