Welcome to your AWS Free Test - 6

1. Question: A company is designing a new service that will run on Amazon EC2 instances behind an Elastic Load Balancer. However, many of the web service clients can only reach IP addresses that are whitelisted on their firewalls. What should a solutions architect recommend to meet the clients' needs?
Select 1 option(s):
- A Network Load Balancer with an associated Elastic IP address
- An Application Load Balancer with an associated Elastic IP address
- An A record in an Amazon Route 53 hosted zone pointing to an Elastic IP address
- An EC2 instance with a public IP address running as a proxy in front of the load balancer

2. Question: A solutions architect is using Amazon S3 to design the storage architecture of a new digital media application. The media files must be resilient to the loss of an Availability Zone. Some files are accessed frequently while other files are rarely accessed in an unpredictable pattern. The solutions architect must minimize the costs of storing and retrieving the media files. Which storage option meets these requirements?
Select 1 option(s):
- S3 Standard
- S3 Intelligent-Tiering
- S3 Standard-Infrequent Access (S3 Standard-IA)
- S3 One Zone-Infrequent Access (S3 One Zone-IA)

3. Question: A company has two applications it wants to migrate to AWS. Both applications process a large set of files by accessing the same files at the same time. Both applications need to read the files with low latency. Which architecture should a solutions architect recommend for this situation?
Select 1 option(s):
- Configure two AWS Lambda functions to run the applications. Create an Amazon EC2 instance with an instance store volume to store the data.
- Configure two AWS Lambda functions to run the applications. Create an Amazon EC2 instance with an Amazon Elastic Block Store (Amazon EBS) volume to store the data.
- Configure one memory-optimized Amazon EC2 instance to run both applications simultaneously. Create an Amazon Elastic Block Store (Amazon EBS) volume with Provisioned IOPS to store the data.
- Configure two Amazon EC2 instances to run both applications. Configure Amazon Elastic File System (Amazon EFS) with General Purpose performance mode and Bursting Throughput mode to store the data.

4. Question: A company has copied 1 PB of data from a colocation facility to an Amazon S3 bucket in the us-east-1 Region using an AWS Direct Connect link. The company now wants to copy the data to another S3 bucket in the us-west-2 Region. The colocation facility does not allow the use of AWS Snowball. What should a solutions architect recommend to accomplish this?
Select 1 option(s):
- Order a Snowball Edge device to copy the data from one Region to another Region.
- Transfer contents from the source S3 bucket to a target S3 bucket using the S3 console.
- Use the aws s3 sync command to copy data from the source bucket to the destination bucket.
- Add a cross-Region replication configuration to copy objects across S3 buckets in different Regions.

5. Question: A company is migrating a Linux-based web server group to AWS. The web servers must access files in a shared file store for some content. To meet the migration date, minimal changes can be made. What should a solutions architect do to meet these requirements?
Select 1 option(s):
- Create an Amazon S3 Standard bucket with access to the web servers.
- Configure an Amazon CloudFront distribution with an Amazon S3 bucket as the origin.
- Create an Amazon Elastic File System (Amazon EFS) volume and mount it on all web servers.
- Configure Amazon Elastic Block Store (Amazon EBS) Provisioned IOPS SSD (io1) volumes and mount them on all web servers.
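Question 5 above turns on whether a shared POSIX file store can be attached to a fleet of Linux web servers with minimal changes. Purely as an illustration of how such a share is provisioned (not as the answer key), here is a minimal boto3 sketch that creates an EFS file system and a mount target; the subnet and security group IDs are placeholders, not values from the question.

    import boto3

    # Hypothetical IDs for illustration only.
    SUBNET_ID = "subnet-0abc1234"
    SECURITY_GROUP_ID = "sg-0abc1234"

    efs = boto3.client("efs")

    # Create a General Purpose, bursting-throughput file system.
    fs = efs.create_file_system(
        CreationToken="shared-web-content",
        PerformanceMode="generalPurpose",
        ThroughputMode="bursting",
    )

    # Expose the file system in one subnet; each web server mounts it over NFS.
    efs.create_mount_target(
        FileSystemId=fs["FileSystemId"],
        SubnetId=SUBNET_ID,
        SecurityGroups=[SECURITY_GROUP_ID],
    )

The servers would then mount the file system with a standard NFS mount (for example via /etc/fstab), which is why this approach typically needs few application changes.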
6. Question: An application uses an Amazon RDS MySQL DB instance. The RDS database is becoming low on disk space. A solutions architect wants to increase the disk space without downtime. Which solution meets these requirements with the LEAST amount of effort?
Select 1 option(s):
- Enable storage auto scaling in RDS.
- Increase the RDS database instance size.
- Change the RDS database instance storage type to Provisioned IOPS.
- Back up the RDS database, increase the storage capacity, restore the database, and stop the previous instance.

7. Question: A company has a multi-tier application deployed on several Amazon EC2 instances in an Auto Scaling group. An Amazon RDS for Oracle instance is the application's data layer that uses Oracle-specific PL/SQL functions. Traffic to the application has been steadily increasing. This is causing the EC2 instances to become overloaded and the RDS instance to run out of storage. The Auto Scaling group does not have any scaling metrics and defines the minimum healthy instance count only. The company predicts that traffic will continue to increase at a steady but unpredictable rate before leveling off. What should a solutions architect do to ensure the system can automatically scale for the increased traffic?
Select 2 option(s):
- Configure storage Auto Scaling on the RDS for Oracle instance.
- Migrate the database to Amazon Aurora to use Auto Scaling storage.
- Configure an alarm on the RDS for Oracle instance for low free storage space.
- Configure the Auto Scaling group to use the average CPU as the scaling metric.
- Configure the Auto Scaling group to use the average free memory as the scaling metric.

8. Question: A company has several Amazon EC2 instances set up in a private subnet for security reasons. These instances host applications that read and write large amounts of data to and from Amazon S3 regularly. Currently, subnet routing directs all the traffic destined for the internet through a NAT gateway. The company wants to optimize the overall cost without impacting the ability of the application to communicate with Amazon S3 or the outside internet. What should a solutions architect do to optimize costs?
Select 1 option(s):
- Create an additional NAT gateway. Update the route table to route to the NAT gateway. Update the network ACL to allow S3 traffic.
- Create an internet gateway. Update the route table to route traffic to the internet gateway. Update the network ACL to allow S3 traffic.
- Create a VPC endpoint for Amazon S3. Attach an endpoint policy to the endpoint. Update the route table to direct traffic to the VPC endpoint.
- Create an AWS Lambda function outside of the VPC to handle S3 requests. Attach an IAM policy to the EC2 instances, allowing them to invoke the Lambda function.

9. Question: A company recently expanded globally and wants to make its application accessible to users in those geographic locations. The application is deployed on Amazon EC2 instances behind an Application Load Balancer in an Auto Scaling group. The company needs the ability to shift traffic from resources in one Region to another. What should a solutions architect recommend?
Select 1 option(s):
- Configure an Amazon Route 53 latency routing policy.
- Configure an Amazon Route 53 geolocation routing policy.
- Configure an Amazon Route 53 geoproximity routing policy.
- Configure an Amazon Route 53 multivalue answer routing policy.
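Question 9 hinges on Route 53 routing policies. Purely to show how a Region-aware record is defined (not as the answer to the question), the following boto3 sketch upserts a latency-based alias record for one Region; the hosted zone ID, domain name, ALB DNS name, and ALB hosted zone ID are all placeholders.

    import boto3

    route53 = boto3.client("route53")

    # All identifiers below are placeholders for illustration.
    HOSTED_ZONE_ID = "Z0EXAMPLE"
    ALB_DNS_NAME = "my-alb-1234567890.us-east-1.elb.amazonaws.com"
    ALB_HOSTED_ZONE_ID = "Z0ALBEXAMPLE"  # the ELB's canonical hosted zone ID

    route53.change_resource_record_sets(
        HostedZoneId=HOSTED_ZONE_ID,
        ChangeBatch={
            "Changes": [{
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "api.example.com",
                    "Type": "A",
                    "SetIdentifier": "us-east-1",   # one record set per Region
                    "Region": "us-east-1",          # latency-based routing
                    "AliasTarget": {
                        "HostedZoneId": ALB_HOSTED_ZONE_ID,
                        "DNSName": ALB_DNS_NAME,
                        "EvaluateTargetHealth": True,
                    },
                },
            }]
        },
    )

A second record set with a different SetIdentifier and Region would be added for the other Region; replacing the Region key with a GeoLocation block would turn this into a geolocation record instead.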
10. Question: A company receives inconsistent service from its data center provider because the company is headquartered in an area affected by natural disasters. The company is not ready to fully migrate to the AWS Cloud, but it wants a failover environment on AWS in case the on-premises data center fails. The company runs web servers that connect to external vendors. The data available on AWS and on premises must be uniform. Which solution should a solutions architect recommend that has the LEAST amount of downtime?
Select 1 option(s):
- Configure an Amazon Route 53 failover record. Run application servers on Amazon EC2 instances behind an Application Load Balancer in an Auto Scaling group. Set up AWS Storage Gateway with stored volumes to back up data to Amazon S3.
- Configure an Amazon Route 53 failover record. Execute an AWS CloudFormation template from a script to create Amazon EC2 instances behind an Application Load Balancer. Set up AWS Storage Gateway with stored volumes to back up data to Amazon S3.
- Configure an Amazon Route 53 failover record. Set up an AWS Direct Connect connection between a VPC and the data center. Run application servers on Amazon EC2 in an Auto Scaling group. Run an AWS Lambda function to execute an AWS CloudFormation template to create an Application Load Balancer.
- Configure an Amazon Route 53 failover record. Run an AWS Lambda function to execute an AWS CloudFormation template to launch two Amazon EC2 instances. Set up AWS Storage Gateway with stored volumes to back up data to Amazon S3. Set up an AWS Direct Connect connection between a VPC and the data center.

11. Question: A company has a 143 TB MySQL database that it wants to migrate to AWS. The plan is to use Amazon Aurora MySQL as the platform going forward. The company has a 100 Mbps AWS Direct Connect connection to Amazon VPC. Which solution meets the company's needs and takes the LEAST amount of time?
Select 1 option(s):
- Use a gateway endpoint for Amazon S3. Migrate the data to Amazon S3. Import the data into Aurora.
- Upgrade the Direct Connect link to 500 Mbps. Copy the data to Amazon S3. Import the data into Aurora.
- Order an AWS Snowmobile and copy the database backup to it. Have AWS import the data into Amazon S3. Import the backup into Aurora.
- Order four 50-TB AWS Snowball devices and copy the database backup onto them. Have AWS import the data into Amazon S3. Import the data into Aurora.

12. Question: A company is experiencing growth as demand for its product has increased. The company's existing purchasing application is slow when traffic spikes. The application is a monolithic three-tier application that uses synchronous transactions and sometimes sees bottlenecks in the application tier. A solutions architect needs to design a solution that can meet required application response times while accounting for traffic volume spikes. Which solution will meet these requirements?
Select 1 option(s):
- Vertically scale the application instance using a larger Amazon EC2 instance size.
- Scale the application's persistence layer horizontally by introducing Oracle RAC on AWS.
- Scale the web and application tiers horizontally using Auto Scaling groups and an Application Load Balancer.
- Decouple the application and data tiers using Amazon Simple Queue Service (Amazon SQS) with asynchronous AWS Lambda calls.
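Questions 7 and 12 both involve giving an Auto Scaling group a scaling metric. As a minimal sketch (the group name and the 60% target are illustrative assumptions), a target tracking policy on average CPU utilization can be attached with boto3 like this:

    import boto3

    autoscaling = boto3.client("autoscaling")

    # "web-tier-asg" and the 60% target are placeholder values.
    autoscaling.put_scaling_policy(
        AutoScalingGroupName="web-tier-asg",
        PolicyName="cpu-target-tracking",
        PolicyType="TargetTrackingScaling",
        TargetTrackingConfiguration={
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "ASGAverageCPUUtilization",
            },
            "TargetValue": 60.0,  # keep average CPU around 60%
        },
    )

The group then scales out and in automatically to hold average CPU near the target, which is the kind of metric-driven behavior these questions describe.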
13. Question: A company runs an online marketplace web application on AWS. The application serves hundreds of thousands of users during peak hours. The company needs a scalable, near-real-time solution to share the details of millions of financial transactions with several other internal applications. Transactions also need to be processed to remove sensitive data before being stored in a document database for low-latency retrieval. What should a solutions architect recommend to meet these requirements?
Select 1 option(s):
- Store the transactions data in Amazon DynamoDB. Set up a rule in DynamoDB to remove sensitive data from every transaction upon write. Use DynamoDB Streams to share the transactions data with other applications.
- Stream the transactions data into Amazon Kinesis Data Firehose to store data in Amazon DynamoDB and Amazon S3. Use AWS Lambda integration with Kinesis Data Firehose to remove sensitive data. Other applications can consume the data stored in Amazon S3.
- Stream the transactions data into Amazon Kinesis Data Streams. Use AWS Lambda integration to remove sensitive data from every transaction and then store the transactions data in Amazon DynamoDB. Other applications can consume the transactions data off the Kinesis data stream.
- Store the batched transactions data in Amazon S3 as files. Use AWS Lambda to process every file and remove sensitive data before updating the files in Amazon S3. The Lambda function then stores the data in Amazon DynamoDB. Other applications can consume transaction files stored in Amazon S3.

14. Question: A company hosts a training site on a fleet of Amazon EC2 instances. The company anticipates that its new course, which consists of dozens of training videos on the site, will be extremely popular when it is released in 1 week. What should a solutions architect do to minimize the anticipated server load?
Select 1 option(s):
- Store the videos in Amazon ElastiCache for Redis. Update the web servers to serve the videos using the ElastiCache API.
- Store the videos in Amazon Elastic File System (Amazon EFS). Create a user data script for the web servers to mount the EFS volume.
- Store the videos in an Amazon S3 bucket. Create an Amazon CloudFront distribution with an origin access identity (OAI) for that S3 bucket. Restrict Amazon S3 access to the OAI.
- Store the videos in an Amazon S3 bucket. Create an AWS Storage Gateway file gateway to access the S3 bucket. Create a user data script for the web servers to mount the file gateway.

15. Question: A company previously migrated its data warehouse solution to AWS. The company also has an AWS Direct Connect connection. Corporate office users query the data warehouse using a visualization tool. The average size of a query returned by the data warehouse is 50 MB and each webpage sent by the visualization tool is approximately 500 KB. Result sets returned by the data warehouse are not cached. Which solution provides the LOWEST data transfer egress cost for the company?
Select 1 option(s):
- Host the visualization tool on premises and query the data warehouse directly over the internet.
- Host the visualization tool in the same AWS Region as the data warehouse. Access it over the internet.
- Host the visualization tool on premises and query the data warehouse directly over a Direct Connect connection at a location in the same AWS Region.
- Host the visualization tool in the same AWS Region as the data warehouse and access it over a Direct Connect connection at a location in the same Region.
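Question 13 describes streaming transactions, stripping sensitive fields with Lambda, and storing the result in a document database. A minimal Lambda handler for a Kinesis event source might look like the sketch below; the table name and the "card_number" field are assumptions made only for illustration.

    import base64
    import json

    import boto3

    # Placeholder table name for illustration.
    table = boto3.resource("dynamodb").Table("transactions")

    def handler(event, context):
        """Triggered by a Kinesis data stream; scrubs and stores each record."""
        for record in event["Records"]:
            # Kinesis delivers the payload base64-encoded.
            payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
            payload.pop("card_number", None)  # remove the sensitive field
            # Assumes the decoded payload uses DynamoDB-compatible types.
            table.put_item(Item=payload)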
16. Question: A development team needs to host a website that will be accessed by other teams. The website contents consist of HTML, CSS, client-side JavaScript, and images. Which method is the MOST cost-effective for hosting the website?
Select 1 option(s):
- Containerize the website and host it in AWS Fargate.
- Create an Amazon S3 bucket and host the website there.
- Deploy a web server on an Amazon EC2 instance to host the website.
- Configure an Application Load Balancer with an AWS Lambda target that uses the Express.js framework.

17. Question: A company has an application that generates a large number of files, each approximately 5 MB in size. The files are stored in Amazon S3. Company policy requires the files to be stored for 4 years before they can be deleted. Immediate accessibility is always required as the files contain critical business data that is not easy to reproduce. The files are frequently accessed in the first 30 days of the object creation but are rarely accessed after the first 30 days. Which storage solution is MOST cost-effective?
Select 1 option(s):
- Create an S3 bucket lifecycle policy to move files from S3 Standard to S3 Glacier 30 days from object creation. Delete the files 4 years after object creation.
- Create an S3 bucket lifecycle policy to move files from S3 Standard to S3 One Zone-Infrequent Access (S3 One Zone-IA) 30 days from object creation. Delete the files 4 years after object creation.
- Create an S3 bucket lifecycle policy to move files from S3 Standard to S3 Standard-Infrequent Access (S3 Standard-IA) 30 days from object creation. Delete the files 4 years after object creation.
- Create an S3 bucket lifecycle policy to move files from S3 Standard to S3 Standard-Infrequent Access (S3 Standard-IA) 30 days from object creation. Move the files to S3 Glacier 4 years after object creation.

18. Question: A company is designing a website that uses an Amazon S3 bucket to store static images. The company wants all future requests to have faster response times while reducing both latency and cost. Which service configuration should a solutions architect recommend?
Select 1 option(s):
- Deploy a NAT server in front of Amazon S3.
- Deploy Amazon CloudFront in front of Amazon S3.
- Deploy a Network Load Balancer in front of Amazon S3.
- Configure Auto Scaling to automatically adjust the capacity of the website.

19. Question: A company needs to comply with a regulatory requirement that states all emails must be stored and archived externally for 7 years. An administrator has created compressed email files on premises and wants a managed service to transfer the files to AWS storage. Which managed service should a solutions architect recommend?
Select 1 option(s):
- Amazon Elastic File System (Amazon EFS)
- Amazon S3 Glacier
- AWS Backup
- AWS Storage Gateway

20. Question: A company that develops web applications has launched hundreds of Application Load Balancers (ALBs) in multiple Regions. The company wants to create an allow list for the IPs of all the load balancers on its firewall device. A solutions architect is looking for a one-time, highly available solution to address this request, which will also help reduce the number of IPs that need to be allowed by the firewall. What should the solutions architect recommend to meet these requirements?
Select 1 option(s):
- Create an AWS Lambda function to keep track of the IPs for all the ALBs in different Regions. Keep refreshing this list.
- Set up a Network Load Balancer (NLB) with Elastic IPs. Register the private IPs of all the ALBs as targets to this NLB.
- Launch AWS Global Accelerator and create endpoints for all the Regions. Register all the ALBs in different Regions to the corresponding endpoints.
- Set up an Amazon EC2 instance, assign an Elastic IP to this EC2 instance, and configure the instance as a proxy to forward traffic to all the ALBs.
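Question 17 tests S3 lifecycle rules: transition after 30 days, delete after 4 years. A minimal boto3 sketch of such a rule (the bucket name is a placeholder, and 4 years is approximated as 1460 days) could look like this:

    import boto3

    s3 = boto3.client("s3")

    # "example-files-bucket" is a placeholder bucket name.
    s3.put_bucket_lifecycle_configuration(
        Bucket="example-files-bucket",
        LifecycleConfiguration={
            "Rules": [{
                "ID": "ia-after-30-days-expire-after-4-years",
                "Filter": {"Prefix": ""},   # apply to every object
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                ],
                "Expiration": {"Days": 1460},  # roughly 4 years
            }]
        },
    )

Swapping STANDARD_IA for ONEZONE_IA or GLACIER would express the other options in the question, which is exactly the trade-off it asks you to weigh.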
21. Question: A solutions architect is designing the storage architecture for a new web application used for storing and viewing engineering drawings. All application components will be deployed on the AWS infrastructure. The application design must support caching to minimize the amount of time that users wait for the engineering drawings to load. The application must be able to store petabytes of data. Which combination of storage and caching should the solutions architect use?
Select 1 option(s):
- Amazon S3 with Amazon CloudFront
- Amazon S3 Glacier with Amazon ElastiCache
- Amazon Elastic Block Store (Amazon EBS) volumes with Amazon CloudFront
- AWS Storage Gateway with Amazon ElastiCache

22. Question: A company is building a document storage application on AWS. The application runs on Amazon EC2 instances in multiple Availability Zones. The company requires the document store to be highly available. The documents need to be returned immediately when requested. The lead engineer has configured the application to use Amazon Elastic Block Store (Amazon EBS) to store the documents, but is willing to consider other options to meet the availability requirement. What should a solutions architect recommend?
Select 1 option(s):
- Snapshot the EBS volumes regularly and build new volumes using those snapshots in additional Availability Zones.
- Use Amazon EBS for the EC2 instance root volumes. Configure the application to build the document store on Amazon S3.
- Use Amazon EBS for the EC2 instance root volumes. Configure the application to build the document store on Amazon S3 Glacier.
- Use at least three Provisioned IOPS EBS volumes for EC2 instances. Mount the volumes to the EC2 instances in a RAID 5 configuration.

23. Question: As part of budget planning, management wants a report of AWS billed items listed by user. The data will be used to create department budgets. A solutions architect needs to determine the most efficient way to obtain this report information. Which solution meets these requirements?
Select 1 option(s):
- Run a query with Amazon Athena to generate the report.
- Create a report in Cost Explorer and download the report.
- Access the bill details from the billing dashboard and download the bill.
- Modify a cost budget in AWS Budgets to alert with Amazon Simple Email Service (Amazon SES).

24. Question: A company stores call recordings on a monthly basis. Statistically, the recorded data may be referenced randomly within a year but accessed rarely after 1 year. Files that are newer than 1 year old must be queried and retrieved as quickly as possible. A delay in retrieving older files is acceptable. A solutions architect needs to store the recorded data at a minimal cost. Which solution is MOST cost-effective?
Select 1 option(s):
- Store individual files in Amazon S3 Glacier and store search metadata in object tags created in S3 Glacier. Query S3 Glacier tags and retrieve the files from S3 Glacier.
- Store individual files in Amazon S3. Use lifecycle policies to move the files to Amazon S3 Glacier after 1 year. Query and retrieve the files from Amazon S3 or S3 Glacier.
- Archive individual files and store search metadata for each archive in Amazon S3. Use lifecycle policies to move the files to Amazon S3 Glacier after 1 year. Query and retrieve the files by searching for metadata from Amazon S3.
- Archive individual files in Amazon S3. Use lifecycle policies to move the files to Amazon S3 Glacier after 1 year. Store search metadata in Amazon DynamoDB. Query the files from DynamoDB and retrieve them from Amazon S3 or S3 Glacier.
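Question 24 contrasts ways of keeping archived files searchable. One of the listed patterns, keeping each file in Amazon S3 while indexing its metadata in DynamoDB, might be sketched like this; the bucket name, table name, and attribute names are illustrative assumptions, not part of the question.

    import boto3

    s3 = boto3.client("s3")
    table = boto3.resource("dynamodb").Table("call-recordings-index")  # placeholder table

    def store_recording(local_path, recording_id, call_date, agent_id):
        """Upload a recording to S3 and index its metadata in DynamoDB."""
        key = f"recordings/{call_date}/{recording_id}.wav"
        s3.upload_file(local_path, "example-recordings-bucket", key)  # placeholder bucket
        table.put_item(Item={
            "recording_id": recording_id,
            "call_date": call_date,
            "agent_id": agent_id,
            "s3_key": key,
        })

A lifecycle rule like the one sketched after Question 17 would then move the objects to S3 Glacier after a year while the DynamoDB index stays queryable.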
25. Question: A company has an on-premises business application that generates hundreds of files each day. These files are stored on an SMB file share and require a low-latency connection to the application servers. A new company policy states all application-generated files must be copied to AWS. There is already a VPN connection to AWS. The application development team does not have time to make the necessary code modifications to move the application to AWS. Which service should a solutions architect recommend to allow the application to copy files to AWS?
Select 1 option(s):
- Amazon Elastic File System (Amazon EFS)
- Amazon FSx for Windows File Server
- AWS Snowball
- AWS Storage Gateway

26. Question: A company wants to migrate a workload to AWS. The chief information security officer requires that all data be encrypted at rest when stored in the cloud. The company wants complete control of encryption key lifecycle management. The company must be able to immediately remove the key material and audit key usage independently of AWS CloudTrail. The chosen services should integrate with other storage services that will be used on AWS. Which service satisfies these security requirements?
Select 1 option(s):
- AWS CloudHSM with the CloudHSM client
- AWS Key Management Service (AWS KMS) with AWS CloudHSM
- AWS Key Management Service (AWS KMS) with an external key material origin
- AWS Key Management Service (AWS KMS) with AWS managed customer master keys (CMKs)

27. Question: A public-facing web application queries a database hosted on an Amazon EC2 instance in a private subnet. A large number of queries involve multiple table joins, and the application performance has been degrading due to an increase in complex queries. The application team will be performing updates to improve performance. What should a solutions architect recommend to the application team?
Select 2 option(s):
- Cache query data in Amazon SQS.
- Create a read replica to offload queries.
- Migrate the database to Amazon Athena.
- Implement Amazon DynamoDB Accelerator to cache data.
- Migrate the database to Amazon RDS.

28. Question: A company wants to build a scalable key management infrastructure to support developers who need to encrypt data in their applications. What should a solutions architect do to reduce the operational burden?
Select 1 option(s):
- Use multi-factor authentication (MFA) to protect the encryption keys.
- Use AWS Key Management Service (AWS KMS) to protect the encryption keys.
- Use AWS Certificate Manager (ACM) to create, store, and assign the encryption keys.
- Use an IAM policy to limit the scope of users who have access permissions to protect the encryption keys.
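Question 28 asks how to give developers encryption keys without operational burden. Purely to show the shape of the AWS KMS calls typically involved in envelope encryption (the key description is a placeholder), a minimal boto3 sketch is:

    import boto3

    kms = boto3.client("kms")

    # Create a customer master key; in practice this is usually done once.
    key_id = kms.create_key(Description="application data key (illustration)")["KeyMetadata"]["KeyId"]

    # Ask KMS for a data key: the plaintext half encrypts data locally,
    # the encrypted half is stored alongside the ciphertext.
    data_key = kms.generate_data_key(KeyId=key_id, KeySpec="AES_256")
    plaintext_key = data_key["Plaintext"]
    encrypted_key = data_key["CiphertextBlob"]

    # Later, the encrypted data key is handed back to KMS to recover the plaintext.
    recovered = kms.decrypt(CiphertextBlob=encrypted_key)["Plaintext"]
    assert recovered == plaintext_key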
29. Question: A company has two AWS accounts: Production and Development. There are code changes ready in the Development account to push to the Production account. In the alpha phase, only two senior developers on the development team need access to the Production account. In the beta phase, more developers might need access to perform testing as well. What should a solutions architect recommend?
Select 1 option(s):
- Create two policy documents using the AWS Management Console in each account. Assign the policy to developers who need access.
- Create an IAM role in the Development account. Give the IAM role access to the Production account. Allow developers to assume the role.
- Create an IAM role in the Production account with the trust policy that specifies the Development account. Allow developers to assume the role.
- Create an IAM group in the Production account and add it as a principal in the trust policy that specifies the Production account. Add developers to the group.

30. Question: A company has a 10 Gbps AWS Direct Connect connection from its on-premises servers to AWS. The workloads using the connection are critical. The company requires a disaster recovery strategy with maximum resiliency that maintains the current connection bandwidth at a minimum. What should a solutions architect recommend?
Select 1 option(s):
- Set up a new Direct Connect connection in another AWS Region.
- Set up a new AWS managed VPN connection in another AWS Region.
- Set up two new Direct Connect connections: one in the current AWS Region and one in another Region.
- Set up two new AWS managed VPN connections: one in the current AWS Region and one in another Region.

31. Question: A company is using Amazon EC2 to run its big data analytics workloads. These variable workloads run each night, and it is critical they finish by the start of business the following day. A solutions architect has been tasked with designing the MOST cost-effective solution. Which solution will accomplish this?
Select 1 option(s):
- Spot Fleet
- Spot Instances
- Reserved Instances
- On-Demand Instances

32. Question: A company is developing a mobile game that streams score updates to a backend processor and then posts results on a leaderboard. A solutions architect needs to design a solution that can handle large traffic spikes, process the mobile game updates in order of receipt, and store the processed updates in a highly available database. The company also wants to minimize the management overhead required to maintain the solution. What should the solutions architect do to meet these requirements?
Select 1 option(s):
- Push score updates to Amazon Kinesis Data Streams. Process the updates in Kinesis Data Streams with AWS Lambda. Store the processed updates in Amazon DynamoDB.
- Push score updates to Amazon Kinesis Data Streams. Process the updates with a fleet of Amazon EC2 instances set up for Auto Scaling. Store the processed updates in Amazon Redshift.
- Push score updates to an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe an AWS Lambda function to the SNS topic to process the updates. Store the processed updates in a SQL database running on Amazon EC2.
- Push score updates to an Amazon Simple Queue Service (Amazon SQS) queue. Use a fleet of Amazon EC2 instances with Auto Scaling to process the updates in the SQS queue. Store the processed updates in an Amazon RDS Multi-AZ DB instance.
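Question 29 is about cross-account access through an IAM role and a trust policy. A minimal sketch of creating such a role in the Production account follows; the 12-digit Development account ID, the role name, and the attached managed policy are placeholders chosen for illustration.

    import json

    import boto3

    iam = boto3.client("iam")

    DEV_ACCOUNT_ID = "111122223333"  # placeholder Development account ID

    # Trust policy: principals in the Development account may assume this role.
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{DEV_ACCOUNT_ID}:root"},
            "Action": "sts:AssumeRole",
        }],
    }

    iam.create_role(
        RoleName="DevDeployAccess",
        AssumeRolePolicyDocument=json.dumps(trust_policy),
    )

    # Grant the role whatever permissions deployments need (placeholder policy).
    iam.attach_role_policy(
        RoleName="DevDeployAccess",
        PolicyArn="arn:aws:iam::aws:policy/PowerUserAccess",
    )

Developers in the Development account would then call sts:AssumeRole on this role; widening access in the beta phase becomes a matter of adjusting who in the Development account is allowed to assume it.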
33. Question: A company is running a multi-tier web application on premises. The web application is containerized and runs on a number of Linux hosts connected to a PostgreSQL database that contains user records. The operational overhead of maintaining the infrastructure and capacity planning is limiting the company's growth. A solutions architect must improve the application's infrastructure. Which combination of actions should the solutions architect take to accomplish this?
Select 2 option(s):
- Migrate the PostgreSQL database to Amazon Aurora.
- Migrate the web application to be hosted on Amazon EC2 instances.
- Set up an Amazon CloudFront distribution for the web application content.
- Set up Amazon ElastiCache between the web application and the PostgreSQL database.
- Migrate the web application to be hosted on AWS Fargate with Amazon Elastic Container Service (Amazon ECS).

34. Question: A company stores user data in AWS. The data is used continuously with peak usage during business hours. Access patterns vary, with some data not being used for months at a time. A solutions architect must choose a cost-effective solution that maintains the highest level of durability while maintaining high availability. Which storage solution meets these requirements?
Select 1 option(s):
- Amazon S3 Standard
- Amazon S3 Intelligent-Tiering
- Amazon S3 Glacier Deep Archive
- Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA)

35. Question: A company is running an online transaction processing (OLTP) workload on AWS. This workload uses an unencrypted Amazon RDS DB instance in a Multi-AZ deployment. Daily database snapshots are taken from this instance. What should a solutions architect do to ensure the database and snapshots are always encrypted moving forward?
Select 1 option(s):
- Encrypt a copy of the latest DB snapshot. Replace the existing DB instance by restoring the encrypted snapshot.
- Create a new encrypted Amazon Elastic Block Store (Amazon EBS) volume and copy the snapshots to it. Enable encryption on the DB instance.
- Copy the snapshots and enable encryption using AWS Key Management Service (AWS KMS). Restore the encrypted snapshot to an existing DB instance.
- Copy the snapshots to an Amazon S3 bucket that is encrypted using server-side encryption with AWS Key Management Service (AWS KMS) managed keys (SSE-KMS).

36. Question: Management has decided to deploy all AWS VPCs with IPv6 enabled. After some time, a solutions architect tries to launch a new instance and receives an error stating that there is not enough IP address space available in the subnet. What should the solutions architect do to fix this?
Select 1 option(s):
- Check to make sure that only IPv6 was used during the VPC creation.
- Create a new IPv4 subnet with a larger range, and then launch the instance.
- Create a new IPv6-only subnet with a larger range, and then launch the instance.
- Disable the IPv4 subnet and migrate all instances to IPv6 only. Once that is complete, launch the instance.

37. Question: A company has a Microsoft Windows-based application that must be migrated to AWS. This application requires the use of a shared Windows file system attached to multiple Amazon EC2 Windows instances. What should a solutions architect do to accomplish this?
Select 1 option(s):
- Configure a volume using Amazon EFS. Mount the EFS volume to each Windows instance.
- Configure AWS Storage Gateway in Volume Gateway mode. Mount the volume to each Windows instance.
- Configure Amazon FSx for Windows File Server. Mount the Amazon FSx volume to each Windows instance.
- Configure an Amazon EBS volume with the required size. Attach each EC2 instance to the volume. Mount the file system within the volume to each Windows instance.
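Question 35 involves producing an encrypted copy of an RDS snapshot and restoring from it. A minimal boto3 sketch of that flow follows; the snapshot identifiers, instance identifier, and the KMS key alias are placeholders, and in practice you would wait for the copy to become available before restoring.

    import boto3

    rds = boto3.client("rds")

    # Copy an existing unencrypted snapshot, encrypting the copy with a KMS key.
    rds.copy_db_snapshot(
        SourceDBSnapshotIdentifier="mydb-daily-snapshot",      # placeholder
        TargetDBSnapshotIdentifier="mydb-daily-snapshot-enc",  # placeholder
        KmsKeyId="alias/aws/rds",  # supplying a key makes the copy encrypted
    )

    # Restore a new, encrypted DB instance from the encrypted copy
    # (after the copy has finished).
    rds.restore_db_instance_from_db_snapshot(
        DBInstanceIdentifier="mydb-encrypted",                 # placeholder
        DBSnapshotIdentifier="mydb-daily-snapshot-enc",
    )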
38. Question: A company hosts its web application on AWS using seven Amazon EC2 instances. The company requires that the IP addresses of all healthy EC2 instances be returned in response to DNS queries. Which policy should be used to meet this requirement?
Select 1 option(s):
- Simple routing policy
- Latency routing policy
- Multivalue answer routing policy
- Geolocation routing policy

39. Question: A company is developing a new machine learning model solution in AWS. The models are developed as independent microservices that fetch about 1 GB of model data from Amazon S3 at startup and load the data into memory. Users access the models through an asynchronous API. Users can send a request or a batch of requests and specify where the results should be sent. The company provides models to hundreds of users. The usage patterns for the models are irregular. Some models could be unused for days or weeks. Other models could receive batches of thousands of requests at a time. Which solution meets these requirements?
Select 1 option(s):
- The requests from the API are sent to an Application Load Balancer (ALB). Models are deployed as AWS Lambda functions invoked by the ALB.
- The requests from the API are sent to the model's Amazon Simple Queue Service (Amazon SQS) queue. Models are deployed as AWS Lambda functions triggered by SQS events. AWS Auto Scaling is enabled on Lambda to increase the number of vCPUs based on the SQS queue size.
- The requests from the API are sent to the model's Amazon Simple Queue Service (Amazon SQS) queue. Models are deployed as Amazon Elastic Container Service (Amazon ECS) services reading from the queue. AWS App Mesh scales the instances of the ECS cluster based on the SQS queue size.
- The requests from the API are sent to the model's Amazon Simple Queue Service (Amazon SQS) queue. Models are deployed as Amazon Elastic Container Service (Amazon ECS) services reading from the queue. AWS Auto Scaling is enabled on Amazon ECS for both the cluster and copies of the service based on the queue size.

40. Question: A company that hosts its web application on AWS wants to ensure all Amazon EC2 instances, Amazon RDS DB instances, and Amazon Redshift clusters are configured with tags. The company wants to minimize the effort of configuring and operating this check. What should a solutions architect do to accomplish this?
Select 1 option(s):
- Use AWS Config rules to define and detect resources that are not properly tagged.
- Use Cost Explorer to display resources that are not properly tagged. Tag those resources manually.
- Write API calls to check all resources for proper tag allocation. Periodically run the code on an EC2 instance.
- Write API calls to check all resources for proper tag allocation. Schedule an AWS Lambda function through Amazon CloudWatch to periodically run the code.

41. Question: A solutions architect is designing a multi-Region disaster recovery solution for an application that will provide public API access. The application will use Amazon EC2 instances with a userdata script to load application code and an Amazon RDS for MySQL database. The Recovery Time Objective (RTO) is 3 hours and the Recovery Point Objective (RPO) is 24 hours. Which architecture would meet these requirements at the LOWEST cost?
Select 1 option(s):
- Use an Application Load Balancer for Region failover. Deploy new EC2 instances with the userdata script. Deploy separate RDS instances in each Region.
- Use Amazon Route 53 for Region failover. Deploy new EC2 instances with the userdata script. Create a read replica of the RDS instance in a backup Region.
- Use Amazon API Gateway for the public APIs and Region failover. Deploy new EC2 instances with the userdata script. Create a MySQL read replica of the RDS instance in a backup Region.
- Use Amazon Route 53 for Region failover. Deploy new EC2 instances with the userdata script for APIs, and create a snapshot of the RDS instance daily for a backup. Replicate the snapshot to a backup Region.
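Question 40 points at AWS Config managed rules for tag checks. As a sketch (the required tag key "CostCenter" and rule name are assumptions), the REQUIRED_TAGS managed rule can be scoped to the three resource types mentioned in the question like this:

    import json

    import boto3

    config = boto3.client("config")

    config.put_config_rule(
        ConfigRule={
            "ConfigRuleName": "required-tags-check",  # placeholder name
            "Source": {
                "Owner": "AWS",
                "SourceIdentifier": "REQUIRED_TAGS",  # AWS managed rule
            },
            # Which tag keys must be present (placeholder key).
            "InputParameters": json.dumps({"tag1Key": "CostCenter"}),
            "Scope": {
                "ComplianceResourceTypes": [
                    "AWS::EC2::Instance",
                    "AWS::RDS::DBInstance",
                    "AWS::Redshift::Cluster",
                ],
            },
        }
    )

Config then continuously flags any in-scope resource that lacks the tag, with no code to schedule or host.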
42. Question: A company is deploying an application in three AWS Regions using an Application Load Balancer. Amazon Route 53 will be used to distribute traffic between these Regions. Which Route 53 configuration should a solutions architect use to provide the MOST high-performing experience?
Select 1 option(s):
- Create an A record with a latency policy.
- Create an A record with a geolocation policy.
- Create a CNAME record with a failover policy.
- Create a CNAME record with a geoproximity policy.

43. Question: A web application must persist order data to Amazon S3 to support near-real-time processing. A solutions architect needs to create an architecture that is both scalable and fault tolerant. Which solutions meet these requirements?
Select 2 option(s):
- Write the order event to an Amazon DynamoDB table. Use DynamoDB Streams to trigger an AWS Lambda function that parses the payload and writes the data to Amazon S3.
- Write the order event to an Amazon Simple Queue Service (Amazon SQS) queue. Use the queue to trigger an AWS Lambda function that parses the payload and writes the data to Amazon S3.
- Write the order event to an Amazon Simple Notification Service (Amazon SNS) topic. Use the SNS topic to trigger an AWS Lambda function that parses the payload and writes the data to Amazon S3.
- Write the order event to an Amazon Simple Queue Service (Amazon SQS) queue. Use an Amazon EventBridge (Amazon CloudWatch Events) rule to trigger an AWS Lambda function that parses the payload and writes the data to Amazon S3.
- Write the order event to an Amazon Simple Notification Service (Amazon SNS) topic. Use an Amazon EventBridge (Amazon CloudWatch Events) rule to trigger an AWS Lambda function that parses the payload and writes the data to Amazon S3.

44. Question: An administrator of a large company wants to monitor for and prevent any cryptocurrency-related attacks on the company's AWS accounts. Which AWS service can the administrator use to protect the company against attacks?
Select 1 option(s):
- Amazon Cognito
- Amazon GuardDuty
- Amazon Inspector
- Amazon Macie

45. Question: A company has no existing file share services. A new project requires access to file storage that is mountable as a drive for on-premises desktops. The file server must authenticate users to an Active Directory domain before they are able to access the storage. Which service will allow Active Directory users to mount storage as a drive on their desktops?
Select 1 option(s):
- Amazon S3 Glacier
- AWS DataSync
- AWS Snowball Edge
- AWS Storage Gateway
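Question 43 revolves around an event source triggering a Lambda function that persists order data to Amazon S3. For one of the listed variants, an SQS-triggered function, a minimal handler could look like the sketch below; the bucket name and the message shape (JSON with an order_id field) are assumptions made only for illustration.

    import json

    import boto3

    s3 = boto3.client("s3")
    BUCKET = "example-orders-bucket"  # placeholder bucket name

    def handler(event, context):
        """Triggered by an SQS queue; writes each order message to S3."""
        for record in event["Records"]:
            order = json.loads(record["body"])        # assumes JSON order payloads
            key = f"orders/{order['order_id']}.json"  # assumes an order_id field
            s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(order))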
46. Question: A company is running a two-tier ecommerce website using AWS services. The current architecture uses a public-facing Elastic Load Balancer that sends traffic to Amazon EC2 instances in a private subnet. The static content is hosted on EC2 instances, and the dynamic content is retrieved from a MySQL database. The application is running in the United States. The company recently started selling to users in Europe and Australia. A solutions architect needs to design a solution so that international users have an improved browsing experience. Which solution is MOST cost-effective?
Select 1 option(s):
- Host the entire website on Amazon S3.
- Use Amazon CloudFront and Amazon S3 to host static images.
- Increase the number of public load balancers and EC2 instances.
- Deploy the two-tier website in AWS Regions in Europe and Australia.

47. Question: A database is on an Amazon RDS MySQL 5.6 Multi-AZ DB instance that experiences highly dynamic reads. Application developers notice a significant slowdown when testing read performance from a secondary AWS Region. The developers want a solution that provides less than 1 second of read replication latency. What should the solutions architect recommend?
Select 1 option(s):
- Install MySQL on Amazon EC2 in the secondary Region.
- Migrate the database to Amazon Aurora with cross-Region replicas.
- Create another RDS for MySQL read replica in the secondary Region.
- Implement Amazon ElastiCache to improve database query performance.

48. Question: A company is preparing to store confidential data in Amazon S3. For compliance reasons, the data must be encrypted at rest. Encryption key usage must be logged for auditing purposes. Keys must be rotated every year. Which solution meets these requirements and is the MOST operationally efficient?
Select 1 option(s):
- Server-side encryption with customer-provided keys (SSE-C)
- Server-side encryption with Amazon S3 managed keys (SSE-S3)
- Server-side encryption with AWS KMS (SSE-KMS) customer master keys (CMKs) with manual rotation
- Server-side encryption with AWS KMS (SSE-KMS) customer master keys (CMKs) with automatic rotation

49. Question: A company hosts its static website content from an Amazon S3 bucket in the us-east-1 Region. Content is made available through an Amazon CloudFront origin pointing to that bucket. Cross-Region replication is set up to create a second copy of the bucket in the ap-southeast-1 Region. Management wants a solution that provides greater availability for the website. Which combination of actions should a solutions architect take to increase availability?
Select 2 option(s):
- Add both buckets to the CloudFront origin.
- Configure failover routing in Amazon Route 53.
- Create a record in Amazon Route 53 pointing to the replica bucket.
- Create an additional CloudFront origin pointing to the ap-southeast-1 bucket.
- Set up a CloudFront origin group with the us-east-1 bucket as the primary and the ap-southeast-1 bucket as the secondary.
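Question 48 combines SSE-KMS with automatic key rotation. A minimal sketch of wiring those two pieces together with boto3 follows; the bucket name and key description are placeholders for illustration.

    import boto3

    kms = boto3.client("kms")
    s3 = boto3.client("s3")

    # Create a customer master key and turn on yearly automatic rotation.
    key_id = kms.create_key(Description="s3 data key (illustration)")["KeyMetadata"]["KeyId"]
    kms.enable_key_rotation(KeyId=key_id)

    # Make SSE-KMS with that key the bucket's default encryption.
    s3.put_bucket_encryption(
        Bucket="example-confidential-bucket",  # placeholder bucket
        ServerSideEncryptionConfiguration={
            "Rules": [{
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": key_id,
                },
            }]
        },
    )

Every use of the key is then recorded in AWS CloudTrail, which is the auditing angle the question raises.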
50. Question: A company is using a tape backup solution to store its key application data offsite. The daily data volume is around 50 TB. The company needs to retain the backups for 7 years for regulatory purposes. The backups are rarely accessed, and a week's notice is typically given if a backup needs to be restored. The company is now considering a cloud-based option to reduce the storage costs and operational burden of managing tapes. The company also wants to make sure that the transition from tape backups to the cloud minimizes disruptions. Which storage solution is MOST cost-effective?
Select 1 option(s):
- Use AWS Storage Gateway to back up to Amazon S3 Glacier Deep Archive.
- Use AWS Snowball Edge to directly integrate the backups with Amazon S3 Glacier.
- Copy the backup data to Amazon S3 and create a lifecycle policy to move the data to Amazon S3 Glacier.
- Use AWS Storage Gateway to back up to Amazon S3 and create a lifecycle policy to move the backup to Amazon S3 Glacier.
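For context on the storage classes Question 50 compares, an object can also be written straight into an archival tier at upload time rather than transitioned later; a minimal sketch (bucket, key, and file name are placeholders) is:

    import boto3

    s3 = boto3.client("s3")

    # Place the backup directly in the Deep Archive storage class (placeholders).
    with open("backup-2024-01-01.tar.gz", "rb") as backup:
        s3.put_object(
            Bucket="example-tape-replacement-bucket",
            Key="backups/backup-2024-01-01.tar.gz",
            Body=backup,
            StorageClass="DEEP_ARCHIVE",
        )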