Online AWS-Certified-Database-Specialty Practice Test

Free Amazon AWS-Certified-Database-Specialty Exam Dumps Questions

Amazon AWS-Certified-Database-Specialty: AWS Certified Database - Specialty

- Get instant access to AWS-Certified-Database-Specialty practice exam questions

- Get ready to pass the AWS Certified Database - Specialty exam right now using our Amazon AWS-Certified-Database-Specialty exam package, which includes the Amazon AWS-Certified-Database-Specialty practice test and an Amazon AWS-Certified-Database-Specialty Exam Simulator.

- The best online AWS-Certified-Database-Specialty exam study material and preparation tool is here.

4.5 (3615 ratings)

Question 1

A financial company wants to store sensitive user data in an Amazon Aurora PostgreSQL DB cluster. The database will be accessed by multiple applications across the company. The company has mandated that all communications to the database be encrypted and that the server identity be validated. Any non-SSL connection must be denied access to the database.
Which solution addresses these requirements?

Correct Answer:D
PostgreSQL connection parameters: sslrootcert=rds-cert.pem sslmode=[verify-ca | verify-full]
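As an illustration of these client settings, here is a minimal Python sketch using psycopg2, assuming the RDS CA bundle has been downloaded as rds-cert.pem; the host name, database, and credentials are placeholders:

```python
import psycopg2  # PostgreSQL driver; pip install psycopg2-binary

# Connect to the Aurora PostgreSQL cluster endpoint over TLS.
# sslmode="verify-full" validates both the certificate chain and the host name,
# so the server identity is checked as the question requires.
conn = psycopg2.connect(
    host="my-aurora-cluster.cluster-xxxxxxxx.us-east-1.rds.amazonaws.com",  # placeholder endpoint
    port=5432,
    dbname="appdb",                 # placeholder database
    user="app_user",                # placeholder credentials
    password="app_password",
    sslmode="verify-full",          # or "verify-ca" to validate the chain without host-name checking
    sslrootcert="rds-cert.pem",     # Amazon RDS CA bundle stored on the client
)
```

On the server side, rejecting non-SSL connections is handled by setting rds.force_ssl to 1 in the DB cluster parameter group.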

Question 2

A corporation uses Amazon Neptune as the graph database for one of its products. During an ETL procedure, the company's data science team unintentionally produced enormous volumes of temporary data. The Neptune DB cluster automatically extended its storage capacity to accommodate the added data, and the data science team has since deleted the superfluous data.
What should a database professional do to avoid incurring extra costs for cluster volume space that is no longer being used?

Correct Answer:C
The only way to shrink the storage space used by your DB cluster when you have a large amount of unused allocated space is to export all the data in your graph and then reload it into a new DB cluster. Creating and restoring a snapshot does not reduce the amount of storage allocated for your DB cluster, because a snapshot retains the original image of the cluster's underlying storage.
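There is no single API call for the export-and-reload step; as an illustration only, the reload into the new cluster could use Neptune's HTTP bulk loader, roughly as sketched below. The endpoint, S3 export location, and IAM role ARN are placeholders, and if IAM database authentication is enabled the request would also need to be SigV4-signed:

```python
import requests  # plain HTTPS call to the Neptune loader endpoint; pip install requests

NEW_CLUSTER_ENDPOINT = "new-neptune.cluster-xxxxxxxx.us-east-1.neptune.amazonaws.com"  # placeholder

# Start a bulk-load job from the S3 location that holds the exported graph data.
response = requests.post(
    f"https://{NEW_CLUSTER_ENDPOINT}:8182/loader",
    json={
        "source": "s3://my-neptune-export-bucket/export/",                  # placeholder export path
        "format": "csv",                                                    # or ntriples, turtle, etc.
        "iamRoleArn": "arn:aws:iam::123456789012:role/NeptuneLoadFromS3",   # placeholder role
        "region": "us-east-1",
        "failOnError": "FALSE",
    },
    timeout=30,
)
print(response.json())  # returns a loadId that can be polled at /loader/<loadId>
```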

Question 3

A business's production databases are housed on a 3 TB Amazon Aurora MySQL DB cluster deployed in the us-east-1 Region. To meet disaster recovery (DR) requirements, the company's database expert needs to quickly deploy the DB cluster in another AWS Region so it can handle the production load, with an RTO of less than two hours.
Which approach is the MOST operationally effective way to meet these requirements?

Correct Answer:B
The RTO is less than two hours. With a 3 TB database, a cross-Region Aurora read replica that can be promoted in the DR Region is the better option; copying and restoring a snapshot of that size across Regions could easily exceed the RTO.
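A minimal boto3 sketch of the promoted-replica approach, assuming an unencrypted source cluster; the cluster names, ARN, and Regions are placeholders:

```python
import boto3

# Create the replica cluster in the DR Region, pointing at the production cluster's ARN.
rds_dr = boto3.client("rds", region_name="us-west-2")  # placeholder DR Region

rds_dr.create_db_cluster(
    DBClusterIdentifier="aurora-prod-replica",
    Engine="aurora-mysql",
    ReplicationSourceIdentifier="arn:aws:rds:us-east-1:123456789012:cluster:aurora-prod",  # placeholder
    SourceRegion="us-east-1",  # lets boto3 handle cross-Region replication setup
)

# A cluster needs at least one instance before it can serve traffic or be promoted.
rds_dr.create_db_instance(
    DBInstanceIdentifier="aurora-prod-replica-1",
    DBClusterIdentifier="aurora-prod-replica",
    DBInstanceClass="db.r5.xlarge",   # placeholder instance class sized for production load
    Engine="aurora-mysql",
)

# During a disaster, promote the replica to a standalone, writable cluster:
# rds_dr.promote_read_replica_db_cluster(DBClusterIdentifier="aurora-prod-replica")
```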

Question 4

A company has migrated a single MySQL database to Amazon Aurora. The production data is hosted in a DB cluster in VPC_PROD, and 12 testing environments are hosted in VPC_TEST using the same AWS account. Testing results in minimal changes to the test data. The Development team wants each environment refreshed nightly so each test database contains fresh production data every day.
Which migration approach will be the fastest and most cost-effective to implement?

Correct Answer:A
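The text of option A is not reproduced above. If the chosen approach relies on Aurora fast database cloning, which is a common fit for nightly test refreshes from a production cluster, the nightly job could look roughly like the sketch below; all identifiers, subnet groups, and security groups are placeholders:

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")  # placeholder Region

# Drop yesterday's clone, then create a fresh copy-on-write clone of production.
# A clone shares the source cluster's storage, so only changed pages incur cost.
rds.delete_db_cluster(DBClusterIdentifier="test-env-01", SkipFinalSnapshot=True)

rds.restore_db_cluster_to_point_in_time(
    DBClusterIdentifier="test-env-01",
    SourceDBClusterIdentifier="aurora-prod",          # placeholder production cluster
    RestoreType="copy-on-write",                      # clone instead of a full restore
    UseLatestRestorableTime=True,
    DBSubnetGroupName="vpc-test-subnets",             # placeholder subnet group in VPC_TEST
    VpcSecurityGroupIds=["sg-0123456789abcdef0"],     # placeholder security group
)

# Each cloned cluster still needs an instance to accept connections.
rds.create_db_instance(
    DBInstanceIdentifier="test-env-01-instance",
    DBClusterIdentifier="test-env-01",
    DBInstanceClass="db.t3.medium",                   # small class, since test changes are minimal
    Engine="aurora-mysql",
)
```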

Question 5

A business is operating an on-premises application that is divided into three tiers: web, application, and MySQL database. The database is predominantly accessed during business hours, with occasional bursts of activity throughout the day. As part of the company's shift to AWS, a database expert wants to increase the availability and minimize the cost of the MySQL database tier.
Which MySQL database choice satisfies these criteria?

Correct Answer:B
Amazon Aurora Serverless v1 is a simple, cost-effective option for infrequent, intermittent, or unpredictable workloads. https://aws.amazon.com/rds/aurora/serverless/
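A minimal sketch of creating an Aurora Serverless v1 MySQL cluster with auto-pause using boto3; the identifiers and credentials are placeholders, and the engine version (omitted here) must be one that supports the serverless engine mode:

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")  # placeholder Region

rds.create_db_cluster(
    DBClusterIdentifier="app-mysql-serverless",   # placeholder cluster name
    Engine="aurora-mysql",
    EngineMode="serverless",                      # Aurora Serverless v1
    MasterUsername="admin",                       # placeholder credentials
    MasterUserPassword="change-me-please",
    ScalingConfiguration={
        "MinCapacity": 1,                         # baseline ACUs during business hours
        "MaxCapacity": 16,                        # headroom for occasional bursts
        "AutoPause": True,                        # pause when idle to minimize cost
        "SecondsUntilAutoPause": 1800,            # pause after 30 minutes of inactivity
    },
)
```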

Question 6

A company has a heterogeneous six-node production Amazon Aurora DB cluster that handles online transaction processing (OLTP) for the core business and OLAP reports for the human resources department. To match compute resources to the use case, the company has decided to have the reporting workload for the human resources department be directed to two small nodes in the Aurora DB cluster, while every other workload goes to four large nodes in the same DB cluster.
Which option would ensure that the correct nodes are always available for the appropriate workload while meeting these requirements?

Correct Answer:D
https://aws.amazon.com/about-aws/whats-new/2018/11/amazon-aurora-simplifies-workload-management-with-c
You can now create custom endpoints for Amazon Aurora databases. This allows you to distribute and load balance workloads across different sets of database instances in your Aurora cluster. For example, you may provision a set of Aurora Replicas to use an instance type with higher memory capacity in order to run an analytics workload. A custom endpoint can then help you route the analytics workload to these appropriately configured instances, while keeping other instances in your cluster isolated from this workload. As you add or remove instances from the custom endpoint to match your workload, the endpoint helps spread the load around.
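A boto3 sketch of the custom-endpoint idea described above, assuming the two small reporting instances and the cluster are named as below; all identifiers are placeholders:

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")  # placeholder Region

# Custom endpoint that always resolves to the two small HR-reporting instances.
rds.create_db_cluster_endpoint(
    DBClusterIdentifier="aurora-core-cluster",            # placeholder cluster name
    DBClusterEndpointIdentifier="hr-reporting",
    EndpointType="READER",
    StaticMembers=["aurora-small-1", "aurora-small-2"],   # placeholder instance IDs
)

# OLTP writes keep using the cluster's writer endpoint; other read traffic can use a
# second custom endpoint that excludes the reporting instances.
rds.create_db_cluster_endpoint(
    DBClusterIdentifier="aurora-core-cluster",
    DBClusterEndpointIdentifier="oltp-readers",
    EndpointType="READER",
    ExcludedMembers=["aurora-small-1", "aurora-small-2"],
)
```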
