Data for each order needs to be sent to an Amazon SageMaker endpoint to flag suspected fraud. The amount of input data needed for the inference can be as much as 1.5 MB. Which solution meets these requirements with the LOWEST overall latency?

a) Create an Amazon Managed Streaming for Apache Kafka (Amazon MSK) cluster and ingest the data for each order into a topic. Use a Kafka consumer running on Amazon EC2 instances to read these messages and invoke the Amazon SageMaker endpoint.

b) Create an Amazon Kinesis Data Streams stream and ingest the data for each order into the stream. Create an AWS Lambda function to read these messages and invoke the Amazon SageMaker endpoint.

c) Create an Amazon Kinesis Data Firehose delivery stream and ingest the data for each order into the stream. Configure Kinesis Data Firehose to deliver
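To make option b concrete, here is a minimal sketch of what the Lambda consumer could look like. It assumes a Kinesis event source mapping (Kinesis delivers record payloads base64-encoded) and stubs out the actual SageMaker call behind an injected `invoke_endpoint` callable; in a real deployment that callable would wrap boto3's `sagemaker-runtime` client and a hypothetical endpoint name, which are not given in the question.

```python
import base64
import json


def lambda_handler(event, context, invoke_endpoint=None):
    """Sketch of a Lambda function triggered by a Kinesis Data Streams batch.

    `invoke_endpoint` takes the raw record payload (bytes) and returns the
    model response as a JSON string. In production it would call
    sagemaker-runtime's invoke_endpoint with a real endpoint name; it is
    injected here so the sketch can run without AWS credentials.
    """
    results = []
    for record in event["Records"]:
        # Kinesis event source mappings base64-encode each record's data.
        payload = base64.b64decode(record["kinesis"]["data"])
        response = invoke_endpoint(payload)
        results.append(json.loads(response))
    return results
```

Note that each Kinesis Data Streams record is capped at 1 MB, so order payloads approaching 1.5 MB would need to be split or referenced indirectly if this option were chosen, which is part of what the question is probing.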