I have an AWS EventBridge pipe that receives JSON input from SQS, and the target of the pipe is a Lambda function. If I remove the filter from the pipe, the target Lambda gets invoked. But if I use the filter, nothing works, and I can't even see anything in monitoring. Please help me with this problem. Note: I have already tested the JSON event against the filter pattern, and it matches. I have tried removing the filter step and it works, but I need to filter my JSON based on the filter pattern.
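(For context, one thing I'm not sure about: with an SQS source, Pipes delivers each message wrapped in an envelope with fields like messageId and body, so a pattern that matches the bare payload may need its fields nested under body instead. Something like the following, where status and active are placeholder names, not from my actual payload:

{
  "body": {
    "status": ["active"]
  }
})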
@bhagyalakshmimurugesan a year ago
Can we use a cross-account consumer and producer in the pipe?
@edwardgao5469 a year ago
Hi Laszlo, a quick question: do you know how to reference the event from a SQL statement if the pipe's target is a Redshift cluster? Thanks
@Majesticcloud a year ago
Hi, I couldn't get it working by referencing the event parameters from the SQL directly. I'm not even sure it's possible; the Dynamic path parameters section at docs.aws.amazon.com/eventbridge/latest/userguide/eb-pipes-event-target.html says that RedshiftParameters.Sql can be $.detail.state, but it can't be "SELECT * FROM $.detail.state". However, I got it working the following way: in the pipe's enrichment step I used a Lambda function which returned the query, like const response = { query: "INSERT INTO ..." };, then in the SQL statement of the Redshift target I just used $.query to reference that query. If all you want is to load data, another approach you could try is to save the data into S3 and use the COPY command in the SQL statement to load that S3 file.
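To sketch it out a bit more (assuming an SQS source and a batch size of 1; my_table and the id/state columns are just placeholder names for illustration):

// Enrichment Lambda (Node.js) that builds the SQL for the Redshift target.
// Pipes invokes the enrichment function with the source batch as an array.
exports.handler = async (events) => {
  // SQS wraps the message payload in the "body" field as a JSON string.
  const detail = JSON.parse(events[0].body);
  // The Redshift target's SQL statement can then reference this
  // returned field as $.query.
  return {
    query: `INSERT INTO my_table (id, state) VALUES ('${detail.id}', '${detail.state}')`,
  };
};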
@edwardgao5469 a year ago
@@Majesticcloud Thanks for taking the time to look into this problem. We will have a Lambda in the enrichment step to pre-process the data before inserting it into Redshift. As you pointed out, the event/payload can't be directly referenced in the SQL statement. This is greatly helpful, even though I do hope AWS will support that in the future!