I understand that you are using DynamoDB to store your application configuration, which gets updated a few times a day. You are now trying to use DynamoDB Streams to monitor the changes happening to the configuration items, but you are not seeing those changes when the application is deployed to more than one instance.

I believe that the issue here is due to the fact that DynamoDB Streams have a limitation where at most 2 processes can read from the same stream shard.

1. Does the condition that the number of shards should be more than the number of instances also apply to DynamoDB Streams? If so, is there a way to increase the number of shards, and if not, is there a known reason that DynamoDB Streams on small tables fail when read by multiple instances?

No, the number-of-shards logic does not apply here. In DynamoDB, shards are ephemeral: they are created and deleted automatically, as needed. Any shard can also split into multiple new shards; this also occurs automatically.

Again, let me take a look at your table to investigate this further.

Also, please NOTE:

* If you perform a PutItem or UpdateItem operation that does not change any data in an item, then DynamoDB Streams will not write a stream record for that operation.

* No more than 2 processes should be reading from the same stream shard at the same time. Having more than 2 readers per shard may result in throttling.

Maybe the table's stream is getting read-throttled due to the above DynamoDB Streams service limitation, since you are reading it from more than 2 sources.

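To make the one-reader-per-shard point concrete, here is a minimal sketch of how a single consumer would poll the stream with boto3. The stream ARN is hypothetical; you would take the real one from your table's description, and at most 2 such consumers should read any given shard.

    import boto3

    streams = boto3.client('dynamodbstreams')

    # Hypothetical ARN; use the LatestStreamArn from describe_table on your table.
    STREAM_ARN = 'arn:aws:dynamodb:us-east-1:123456789012:table/app-config/stream/2016-12-15T00:00:00.000'

    # Shards are ephemeral, so enumerate whatever shards exist right now.
    description = streams.describe_stream(StreamArn=STREAM_ARN)['StreamDescription']

    for shard in description['Shards']:
        iterator = streams.get_shard_iterator(
            StreamArn=STREAM_ARN,
            ShardId=shard['ShardId'],
            ShardIteratorType='LATEST',  # only changes made from now on
        )['ShardIterator']

        # Poll the shard once; a real consumer would loop with a sleep, and
        # no more than 2 consumers should do this per shard.
        response = streams.get_records(ShardIterator=iterator, Limit=100)
        for record in response['Records']:
            print(record['eventName'], record['dynamodb'].get('Keys'))
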
2. Is there a better way to store and watch such configuration (ideally using AWS infrastructure)? I am looking at triggers currently.

Triggers together with Lambda are a good option as well.

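If you wire the stream to Lambda through an event source mapping, the handler itself stays small. This is only a sketch of what it might look like; it assumes the stream view type includes new images.

    # Sketch of a Lambda handler invoked by a DynamoDB Streams event source
    # mapping; each invocation carries a batch of stream records.
    def lambda_handler(event, context):
        for record in event['Records']:
            if record['eventName'] in ('INSERT', 'MODIFY'):
                # NewImage is present when the stream view type includes new images.
                print('Config changed:', record['dynamodb'].get('NewImage'))
            elif record['eventName'] == 'REMOVE':
                print('Config removed:', record['dynamodb'].get('Keys'))
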
But personally, I feel you should store this in an S3 bucket, partitioned by date/period, and see if that works out for you.

Something like this: s3://configuration-management/year=2015/month=1/day=1/

Basically, you can store the configuration in a file in plain text, CSV, or JSON format in the S3 bucket, and from there you can add event notifications to that bucket to track your changes.

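As a sketch, writing the day's configuration file with boto3 could look like this; the bucket name follows the example above, and the config content and file name are hypothetical.

    import json
    from datetime import date

    import boto3

    s3 = boto3.client('s3')

    config = {'max_retries': 5, 'feature_x_enabled': True}  # hypothetical config
    today = date.today()

    # Mirrors the s3://configuration-management/year=.../month=.../day=.../ layout.
    key = 'year={0}/month={1}/day={2}/config.json'.format(today.year, today.month, today.day)

    s3.put_object(
        Bucket='configuration-management',
        Key=key,
        Body=json.dumps(config).encode('utf-8'),
        ContentType='application/json',
    )
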
For example, for all PUT/POST events you can send notifications to an SNS topic, an SQS queue, or an AWS Lambda function. From there you can do whatever you want; the scope is wide.

Reference: http://docs.aws.amazon.com/AmazonS3/latest/dev/NotificationHowTo.html

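Setting that up programmatically is a single call. Here is a sketch with boto3; the SNS topic ARN is hypothetical, and the topic's policy must already allow S3 to publish to it.

    import boto3

    s3 = boto3.client('s3')

    # Hypothetical topic ARN; the topic policy must grant S3 sns:Publish.
    s3.put_bucket_notification_configuration(
        Bucket='configuration-management',
        NotificationConfiguration={
            'TopicConfigurations': [{
                'TopicArn': 'arn:aws:sns:us-east-1:123456789012:config-changes',
                'Events': ['s3:ObjectCreated:*'],  # covers PUT, POST, copy, multipart
            }],
        },
    )
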
Also, you can run SQL queries against your S3 bucket using our latest service, Athena, to get information out and store it in a different bucket or export it from the console.

http://docs.aws.amazon.com/athena/latest/ug/what-is.html

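Kicking off such a query from code is also possible. This is a sketch with boto3; the database, table, and results bucket are hypothetical, and the table would first need to be defined over the configuration bucket in Athena.

    import boto3

    athena = boto3.client('athena')

    # Hypothetical database/table defined over s3://configuration-management/.
    response = athena.start_query_execution(
        QueryString="SELECT * FROM configuration WHERE year = '2015' AND month = '1'",
        QueryExecutionContext={'Database': 'config_db'},
        ResultConfiguration={'OutputLocation': 's3://athena-results-example/'},
    )
    print('QueryExecutionId:', response['QueryExecutionId'])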