Amazon AWS Certified Security - Specialty Exam Practice Questions (P. 1)
Question #1
The Security team believes that a former employee may have gained unauthorized access to AWS resources sometime in the past 3 months by using an identified access key.
What approach would enable the Security team to find out what the former employee may have done within AWS?
- A. Use the AWS CloudTrail console to search for user activity. (Most Voted)
- B. Use the Amazon CloudWatch Logs console to filter CloudTrail data by user.
- C. Use AWS Config to see what actions were taken by the user.
- D. Use Amazon Athena to query CloudTrail logs stored in Amazon S3.
Correct Answer:
A

The optimal solution to ascertain what actions a former employee performed on AWS resources is to utilize the AWS CloudTrail console. CloudTrail efficiently records and retains logs of all AWS API calls, providing an accessible method to review these events directly through its console. This approach simplifies the search for any user activity by leveraging the specific access key ID connected to the former employee. Importantly, CloudTrail provides historical data for up to 90 days directly through the console, aligning perfectly with the timeframe of interest in this scenario, making it the most straightforward and effective tool for this investigation.
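A minimal CLI sketch of the same search, assuming a placeholder access key ID (the real key would come from the Security team's findings):

```shell
# Placeholder for the access key identified by the Security team.
KEY_ID="AKIAIOSFODNN7EXAMPLE"

# CloudTrail event history covers roughly the last 90 days of management
# events, matching the 3-month window in the scenario. With live credentials
# this lists every recorded API call made with the key. Guarded with "|| true"
# so the sketch can be dry-run without AWS credentials configured.
aws cloudtrail lookup-events \
  --lookup-attributes "AttributeKey=AccessKeyId,AttributeValue=${KEY_ID}" \
  --max-results 50 || true
```

The console exposes the same filter under Event history, so no extra infrastructure (CloudWatch Logs, Athena) is needed for a 90-day lookback.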
Question #2
A company is storing data in Amazon S3 Glacier. A security engineer implemented a new vault lock policy for 10 TB of data and called the initiate-vault-lock operation 12 hours ago. The audit team identified a typo in the policy that is allowing unintended access to the vault.
What is the MOST cost-effective way to correct this?
- A. Call the abort-vault-lock operation. Update the policy. Call the initiate-vault-lock operation again. (Most Voted)
- B. Copy the vault data to a new S3 bucket. Delete the vault. Create a new vault with the data.
- C. Update the policy to keep the vault lock in place.
- D. Update the policy. Call the initiate-vault-lock operation again to apply the new policy.
Correct Answer:
A
Initiate the lock by attaching a vault lock policy to your vault, which sets the lock to an in-progress state and returns a lock ID. While in the in-progress state, you have 24 hours to validate your vault lock policy before the lock ID expires.
Use the lock ID to complete the lock process. If the vault lock policy doesn't work as expected, you can abort the lock and restart from the beginning. For information on how to use the S3 Glacier API to lock a vault, see Locking a Vault by Using the Amazon S3 Glacier API.
Reference:
https://docs.aws.amazon.com/amazonglacier/latest/dev/vault-lock-policy.html
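A sketch of the abort-and-reinitiate flow with the AWS CLI. The vault name and the policy content are hypothetical; note the CLI wraps the vault lock policy as a JSON string inside a "Policy" key:

```shell
VAULT="corp-archive"   # hypothetical vault name
ACCOUNT="-"            # "-" means the account that owns the credentials

# Corrected vault lock policy (illustrative deny-delete statement), wrapped
# in the {"Policy": "<json-string>"} structure the CLI expects.
cat > corrected-policy.json <<'EOF'
{"Policy": "{\"Version\":\"2012-10-17\",\"Statement\":[{\"Sid\":\"DenyDeleteArchive\",\"Effect\":\"Deny\",\"Principal\":\"*\",\"Action\":\"glacier:DeleteArchive\",\"Resource\":\"*\"}]}"}
EOF

# The lock was initiated only 12 hours ago, so it is still inside the 24-hour
# in-progress window and can be aborted and re-initiated at no cost.
# Guarded with "|| true" so the sketch can be dry-run without credentials.
aws glacier abort-vault-lock    --account-id "$ACCOUNT" --vault-name "$VAULT" || true
aws glacier initiate-vault-lock --account-id "$ACCOUNT" --vault-name "$VAULT" \
  --policy file://corrected-policy.json || true
```

Once the new lock ID is returned, complete-vault-lock would make the corrected policy immutable.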
Question #3
A company wants to control access to its AWS resources by using identities and groups that are defined in its existing Microsoft Active Directory.
What must the company create in its AWS account to map permissions for AWS services to Active Directory user attributes?
- A. AWS IAM groups
- B. AWS IAM users
- C. AWS IAM roles (Most Voted)
- D. AWS IAM access keys
Correct Answer:
C
Reference:
https://aws.amazon.com/blogs/security/how-to-connect-your-on-premises-active-directory-to-aws-using-ad-connector/
Question #4
A company has contracted with a third party to audit several AWS accounts. To enable the audit, cross-account IAM roles have been created in each account targeted for audit. The Auditor is having trouble accessing some of the accounts.
Which of the following may be causing this problem? (Choose three.)
- A. The external ID used by the Auditor is missing or incorrect. (Most Voted)
- B. The Auditor is using the incorrect password.
- C. The Auditor has not been granted sts:AssumeRole for the role in the destination account. (Most Voted)
- D. The Amazon EC2 role used by the Auditor must be set to the destination account role.
- E. The secret key used by the Auditor is missing or incorrect.
- F. The role ARN used by the Auditor is missing or incorrect. (Most Voted)
Correct Answer:
ACF

The issue with the auditor's access might be tied to the configuration of IAM roles, specifically how they handle permissions and identification for cross-account functions. The essential elements like the external ID (impacting the trust relationship), role ARN (identifying the correct role to assume), and necessary permissions (like sts:AssumeRole) must be correctly configured. These elements collectively ensure that the auditor can securely assume the designated role across different accounts. Misconfigurations in any of these could hinder access, highlighting the importance of accurate setup for effective auditing.
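All three failure points surface in a single AssumeRole call. A hedged sketch, with a placeholder role ARN and external ID:

```shell
# Placeholders: destination account ID, role name, and the agreed external ID.
ROLE_ARN="arn:aws:iam::111122223333:role/AuditRole"
EXTERNAL_ID="audit-ext-id-2024"

# This call fails if the caller lacks sts:AssumeRole on the ARN, if the ARN
# itself is wrong, or if the external ID does not match the role's trust
# policy — the three causes identified above. Guarded with "|| true" so the
# sketch can be dry-run without live credentials.
aws sts assume-role \
  --role-arn "$ROLE_ARN" \
  --role-session-name audit-session \
  --external-id "$EXTERNAL_ID" || true
```

A password (option B) is never involved in programmatic cross-account role assumption, which is why it is not a plausible cause.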
Question #5
Compliance requirements state that all communications between company on-premises hosts and EC2 instances be encrypted in transit. Hosts use custom proprietary protocols for their communication, and EC2 instances need to be fronted by a load balancer for increased availability.
Which of the following solutions will meet these requirements?
- A. Offload SSL termination onto an SSL listener on a Classic Load Balancer, and use a TCP connection between the load balancer and the EC2 instances.
- B. Route all traffic through a TCP listener on a Classic Load Balancer, and terminate the TLS connection on the EC2 instances. (Most Voted)
- C. Create an HTTPS listener using an Application Load Balancer, and route all of the communication through that load balancer.
- D. Offload SSL termination onto an SSL listener using an Application Load Balancer, and re-spawn an SSL connection between the load balancer and the EC2 instances.
Correct Answer:
B

For communications using custom proprietary protocols, an Application Load Balancer isn't suitable because it only supports HTTP and HTTPS protocols. Using a Classic Load Balancer, which supports TCP, allows the custom protocols to function. Moreover, to comply with the requirement of encrypting data in transit from end to end, the TLS termination must occur at the EC2 instances, not at the load balancer. Option B perfectly aligns with these needs by routing traffic through a TCP listener on a Classic Load Balancer with TLS termination on the EC2 instances, ensuring encrypted and suitably routed traffic.
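A sketch of the matching Classic Load Balancer configuration; the load balancer name and subnet ID are hypothetical:

```shell
# Hypothetical name and subnet. A TCP listener passes the proprietary protocol
# bytes through unchanged, and TLS terminates on the EC2 instances, so traffic
# stays encrypted end to end through the load balancer.
LB_NAME="custom-proto-clb"
aws elb create-load-balancer \
  --load-balancer-name "$LB_NAME" \
  --listeners "Protocol=TCP,LoadBalancerPort=443,InstanceProtocol=TCP,InstancePort=443" \
  --subnets subnet-0abc1234 || true   # guarded: requires live AWS credentials
```

Because the listener is plain TCP, the load balancer never decrypts the stream, which is exactly what the compliance requirement demands.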
Question #6
An application is currently secured using network access control lists and security groups. Web servers are located in public subnets behind an Application Load Balancer (ALB); application servers are located in private subnets.
How can edge security be enhanced to safeguard the Amazon EC2 instances against attack? (Choose two.)
- A. Configure the application's EC2 instances to use NAT gateways for all inbound traffic.
- B. Move the web servers to private subnets without public IP addresses. (Most Voted)
- C. Configure AWS WAF to provide DDoS attack protection for the ALB. (Most Voted)
- D. Require all inbound network traffic to route through a bastion host in the private subnet.
- E. Require all inbound and outbound network traffic to route through an AWS Direct Connect connection.
Correct Answer:
BC

Moving web servers to private subnets enhances security by reducing direct exposure to the public internet, diminishing potential attack vectors. Pairing this with AWS WAF provides an additional layer of defense by specifically targeting and mitigating common web exploits and DDoS attacks, matching the described efficacy in the AWS WAF documentation regarding the management of web-layer security threats. Together, these measures fortify the edge security around Amazon EC2 instances effectively.
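Attaching a web ACL to the ALB is a one-call operation. A sketch with placeholder ARNs (both would come from your own WAF and ELB resources):

```shell
# Placeholder ARNs for an existing WAFv2 web ACL and the application's ALB.
WEB_ACL_ARN="arn:aws:wafv2:us-east-1:111122223333:regional/webacl/edge-acl/EXAMPLE-ID"
ALB_ARN="arn:aws:elasticloadbalancing:us-east-1:111122223333:loadbalancer/app/web-alb/EXAMPLE"

# Put the web ACL in front of the ALB so common exploits and request floods
# are filtered before reaching the web servers. Guarded with "|| true" so the
# sketch can be dry-run without live credentials.
aws wafv2 associate-web-acl \
  --web-acl-arn "$WEB_ACL_ARN" \
  --resource-arn "$ALB_ARN" || true
```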
Question #7
A Security Administrator is restricting the capabilities of company root user accounts. The company uses AWS Organizations and has enabled all features, including consolidated billing. The top-level account is used for billing and administrative purposes, not for operational AWS resource purposes.
How can the Administrator restrict usage of member root user accounts across the organization?
- A. Disable the use of the root user account at the organizational root. Enable multi-factor authentication of the root user account for each organizational member account.
- B. Configure IAM user policies to restrict root account capabilities for each Organizations member account.
- C. Create an organizational unit (OU) in Organizations with a service control policy that controls usage of the root user. Add all operational accounts to the new OU. (Most Voted)
- D. Configure AWS CloudTrail to integrate with Amazon CloudWatch Logs and then create a metric filter for RootAccountUsage.
Correct Answer:
C
Reference:
https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_about-scps.html
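A sketch of such an SCP. The policy name and description are hypothetical; the aws:PrincipalArn condition matches root users in any member account the policy is attached to:

```shell
# SCP that denies all actions taken as a member account's root user.
cat > deny-root-scp.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "DenyRootUser",
    "Effect": "Deny",
    "Action": "*",
    "Resource": "*",
    "Condition": {
      "StringLike": { "aws:PrincipalArn": "arn:aws:iam::*:root" }
    }
  }]
}
EOF

# Register the SCP; it would then be attached to the OU holding the
# operational accounts. Requires Organizations management-account
# credentials, so the call is guarded for dry runs.
aws organizations create-policy \
  --name DenyRootUsage \
  --type SERVICE_CONTROL_POLICY \
  --description "Block member root user actions" \
  --content file://deny-root-scp.json || true
```

SCPs do not apply to the management account itself, which is why the billing/administrative top-level account remains unaffected.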
Question #8
A Systems Engineer has been tasked with configuring outbound mail through Simple Email Service (SES) and requires compliance with current TLS standards.
The mail application should be configured to connect to which of the following endpoints and corresponding ports?
- A. email.us-east-1.amazonaws.com over port 8080
- B. email-pop3.us-east-1.amazonaws.com over port 995
- C. email-smtp.us-east-1.amazonaws.com over port 587 (Most Voted)
- D. email-imap.us-east-1.amazonaws.com over port 993
Correct Answer:
C
Reference:
https://docs.aws.amazon.com/ses/latest/DeveloperGuide/smtp-connect.html
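The TLS handshake on port 587 uses STARTTLS, which can be verified directly with openssl (network access required, so the call is guarded):

```shell
# SES SMTP interface endpoint for us-east-1; port 587 upgrades the plaintext
# SMTP session to TLS via the STARTTLS command.
SMTP_HOST="email-smtp.us-east-1.amazonaws.com"
openssl s_client -starttls smtp -crlf \
  -connect "${SMTP_HOST}:587" </dev/null || true   # guarded: needs network access
```

Ports 995 and 993 are POP3S/IMAPS ports for receiving mail; SES is a sending service and exposes neither.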
Question #9
A threat assessment has identified a risk whereby an internal employee could exfiltrate sensitive data from a production host running inside AWS (Account 1). The threat was documented as follows:
Threat description: A malicious actor could upload sensitive data from Server X by configuring credentials for an AWS account (Account 2) they control and uploading data to an Amazon S3 bucket within their control.
Server X has outbound internet access configured via a proxy server. Legitimate access to S3 is required so that the application can upload encrypted files to an S3 bucket. Server X is currently using an IAM instance role. The proxy server is not able to inspect any of the server communication due to TLS encryption.
Which of the following options will mitigate the threat? (Choose two.)
- A. Bypass the proxy and use an S3 VPC endpoint with a policy that whitelists only certain S3 buckets within Account 1. (Most Voted)
- B. Block outbound access to public S3 endpoints on the proxy server. (Most Voted)
- C. Configure Network ACLs on Server X to deny access to S3 endpoints.
- D. Modify the S3 bucket policy for the legitimate bucket to allow access only from the public IP addresses associated with the application server.
- E. Remove the IAM instance role from the application server and save API access keys in a trusted and encrypted application config file.
Correct Answer:
AC

To mitigate the threat of an internal employee exfiltrating sensitive data to an unauthorized S3 bucket, pairing a restrictive S3 VPC endpoint policy with network ACLs is a robust strategy. Using an S3 VPC endpoint with a tailored endpoint policy (Option A) ensures that Server X can only interact with whitelisted S3 buckets within Account 1, significantly reducing the risk of data being routed to external AWS accounts. Configuring network ACLs to deny access to unauthorized S3 endpoints (Option C) complements this by blocking unwanted outbound connections at the subnet level, adding a further layer of defense against data leakage. Together, these choices fortify the system against the described internal threat while preserving the application's legitimate uploads to S3.
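A sketch of the endpoint policy from Option A; the bucket name and endpoint ID are hypothetical placeholders:

```shell
# Gateway endpoint policy allowing uploads only to the legitimate bucket in
# Account 1; any bucket in the attacker's Account 2 is implicitly denied.
cat > s3-endpoint-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "AllowOnlyAccount1Bucket",
    "Effect": "Allow",
    "Principal": "*",
    "Action": "s3:PutObject",
    "Resource": "arn:aws:s3:::account1-legitimate-bucket/*"
  }]
}
EOF

# Apply the policy to the existing S3 gateway endpoint. Guarded with
# "|| true" so the sketch can be dry-run without live credentials.
aws ec2 modify-vpc-endpoint \
  --vpc-endpoint-id vpce-0abc1234 \
  --policy-document file://s3-endpoint-policy.json || true
```

Because the endpoint policy is evaluated regardless of which credentials Server X presents, it blocks uploads made with the attacker's Account 2 keys as well.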
Question #10
A company will store sensitive documents in three Amazon S3 buckets based on a data classification scheme of "Sensitive," "Confidential," and "Restricted." The security solution must meet all of the following requirements:
- Each object must be encrypted using a unique key.
- Items that are stored in the "Restricted" bucket require two-factor authentication for decryption.
- AWS KMS must automatically rotate encryption keys annually.
Which of the following meets these requirements?
- A. Create a Customer Master Key (CMK) for each data classification type, and enable the rotation of it annually. For the "Restricted" CMK, define the MFA policy within the key policy. Use S3 SSE-KMS to encrypt the objects. (Most Voted)
- B. Create a CMK grant for each data classification type with EnableKeyRotation and MultiFactorAuthPresent set to true. S3 can then use the grants to encrypt each object with a unique CMK.
- C. Create a CMK for each data classification type, and within the CMK policy, enable rotation of it annually, and define the MFA policy. S3 can then create DEK grants to uniquely encrypt each object within the S3 bucket.
- D. Create a CMK with unique imported key material for each data classification type, and rotate them annually. For the "Restricted" key material, define the MFA policy in the key policy. Use S3 SSE-KMS to encrypt the objects.
Correct Answer:
A

Option A is the best choice because it fulfills all specified requirements. It uses a Customer Master Key (CMK) for each data classification type, enables the required annual rotation, and adds a key policy condition requiring multi-factor authentication (MFA) for the "Restricted" CMK. Using S3 SSE-KMS ensures each object is encrypted with a unique data key. Option D fails because AWS KMS does not support automatic annual rotation for CMKs with imported key material.
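A sketch of creating the "Restricted" CMK; the account ID is a placeholder, and the key policy combines a root-admin allow (so the key stays manageable) with a deny on kms:Decrypt for callers without MFA:

```shell
# Key policy sketch: account root keeps admin, and Decrypt is denied unless
# the caller authenticated with MFA. 111122223333 is a placeholder account ID.
cat > restricted-key-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "EnableRootAdmin",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111122223333:root" },
      "Action": "kms:*",
      "Resource": "*"
    },
    {
      "Sid": "RequireMFAForDecrypt",
      "Effect": "Deny",
      "Principal": { "AWS": "*" },
      "Action": "kms:Decrypt",
      "Resource": "*",
      "Condition": {
        "BoolIfExists": { "aws:MultiFactorAuthPresent": "false" }
      }
    }
  ]
}
EOF

# Create the CMK with that policy, then enable annual rotation. Guarded with
# "|| true" so the sketch can be dry-run without live credentials.
KEY_ID=$(aws kms create-key --policy file://restricted-key-policy.json \
  --query KeyMetadata.KeyId --output text) || true
aws kms enable-key-rotation --key-id "$KEY_ID" || true
```

The unique per-object keys come from SSE-KMS itself, which generates a fresh data key under the CMK for each object.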