Amazon AWS Certified SAP on AWS - Specialty PAS-C01 Exam Practice Questions (P. 1)
Question #1
A global enterprise is running SAP ERP Central Component (SAP ECC) workloads on Oracle in an on-premises environment. The enterprise plans to migrate to SAP S/4HANA on AWS.
The enterprise recently acquired two other companies. One of the acquired companies is running SAP ECC on Oracle as its ERP system. The other acquired company is running an ERP system that is not from SAP. The enterprise wants to consolidate the three ERP systems into one ERP system on SAP S/4HANA on AWS. Not all the data from the acquired companies needs to be migrated to the final ERP system. The enterprise needs to complete this migration with a solution that minimizes cost and maximizes operational efficiency.
Which solution will meet these requirements?
- A. Perform a lift-and-shift migration of all the systems to AWS. Migrate the ERP system that is not from SAP to SAP ECC. Convert all three systems to SAP S/4HANA by using SAP Software Update Manager (SUM) Database Migration Option (DMO). Consolidate all three SAP S/4HANA systems into a final SAP S/4HANA system. Decommission the other systems.
- B. Perform a lift-and-shift migration of all the systems to AWS. Migrate the enterprise's initial system to SAP HANA, and then perform a conversion to SAP S/4HANA. Consolidate the two systems from the acquired companies with this SAP S/4HANA system by using the Selective Data Transition approach with SAP Data Management and Landscape Transformation (DMLT).
- C. Use SAP Software Update Manager (SUM) Database Migration Option (DMO) with System Move to re-architect the enterprise's initial system to SAP S/4HANA and to change the platform to AWS. Consolidate the two systems from the acquired companies with this SAP S/4HANA system by using the Selective Data Transition approach with SAP Data Management and Landscape Transformation (DMLT). (Most Voted)
- D. Use SAP Software Update Manager (SUM) Database Migration Option (DMO) with System Move to re-architect all the systems to SAP S/4HANA and to change the platform to AWS. Consolidate all three SAP S/4HANA systems into a final SAP S/4HANA system. Decommission the other systems.
Correct Answer: A

The correct approach when migrating to SAP S/4HANA on AWS involves using the SAP Software Update Manager (SUM) Database Migration Option (DMO) with System Move. This tool effectively re-architects the existing SAP ECC systems and moves them to AWS. It's essential to integrate a Selective Data Transition strategy, ensuring that only the necessary data is migrated, which enhances efficiency and reduces costs. This strategy, combined with SAP Data Management and Landscape Transformation (DMLT), supports efficient data consolidation from multiple sources into a unified SAP S/4HANA system. This approach is pragmatic and cost-effective, considering not all data from the acquired companies is required in the new system.
Question #2
A global retail company is running its SAP landscape on AWS. Recently, the company made changes to its SAP Web Dispatcher architecture. The company added an additional SAP Web Dispatcher for high availability with an Application Load Balancer (ALB) to balance the load between the two SAP Web Dispatchers.
When users try to access SAP through the ALB, the system is reachable. However, the SAP backend system is showing an error message. An investigation reveals that the issue is related to SAP session handling and distribution of requests. The company confirmed that the system was working as expected with one SAP Web Dispatcher. The company replicated the configuration of that SAP Web Dispatcher to the new SAP Web Dispatcher.
How can the company resolve the error?
- A. Maintain persistence by using session cookies. Enable session stickiness (session affinity) on the SAP Web Dispatchers by setting the wdisp/HTTP/esid_support parameter to True.
- B. Maintain persistence by using session cookies. Enable session stickiness (session affinity) on the ALB. (Most Voted)
- C. Turn on host-based routing on the ALB to route traffic between the SAP Web Dispatchers.
- D. Turn on URL-based routing on the ALB to route traffic to the application based on URL.
Correct Answer: C

The correct approach in this scenario involves host-based routing on the Application Load Balancer (ALB), which allows for proper traffic distribution between both SAP Web Dispatchers regardless of user session information. By employing host-based routing, each request can be intelligently directed to the appropriate dispatcher based on the host header information, helping to mitigate errors related to SAP session handling and distribution that occurred after introducing the second dispatcher. This ensures load is balanced effectively while maintaining system integrity and availability.
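Both ALB mechanisms debated in the options can be set through the Elastic Load Balancing v2 API. The boto3 sketch below is illustrative only: the target group ARN, listener ARN, and hostname are hypothetical, and it simply shows how target-group stickiness (option B) and a host-header listener rule (option C) are each configured, not which one resolves this particular landscape.

```python
# Minimal boto3 sketch (hypothetical ARNs and hostname) of the two ALB settings
# discussed above: target-group stickiness and host-based routing.
import boto3

elbv2 = boto3.client("elbv2", region_name="eu-central-1")  # assumed Region

TARGET_GROUP_ARN = "arn:aws:elasticloadbalancing:...:targetgroup/sap-webdisp/..."  # hypothetical
LISTENER_ARN = "arn:aws:elasticloadbalancing:...:listener/app/sap-alb/..."         # hypothetical

# Option B style: enable session stickiness (session affinity) on the ALB target
# group so that a user's requests keep going to the same SAP Web Dispatcher.
elbv2.modify_target_group_attributes(
    TargetGroupArn=TARGET_GROUP_ARN,
    Attributes=[
        {"Key": "stickiness.enabled", "Value": "true"},
        {"Key": "stickiness.type", "Value": "lb_cookie"},
        {"Key": "stickiness.lb_cookie.duration_seconds", "Value": "3600"},
    ],
)

# Option C style: add a host-header rule on the ALB listener that forwards
# requests for a given hostname to the Web Dispatcher target group.
elbv2.create_rule(
    ListenerArn=LISTENER_ARN,
    Priority=10,
    Conditions=[
        {"Field": "host-header", "HostHeaderConfig": {"Values": ["sap.example.com"]}},  # hypothetical host
    ],
    Actions=[{"Type": "forward", "TargetGroupArn": TARGET_GROUP_ARN}],
)
```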
Question #3
A company hosts its SAP NetWeaver workload on SAP HANA in the AWS Cloud. The SAP NetWeaver application is protected by a cluster solution that uses the Red Hat Enterprise Linux High Availability Add-On. The cluster solution uses an overlay IP address to ensure that the high availability cluster is still accessible during failover scenarios.
An SAP solutions architect needs to facilitate the network connection to this overlay IP address from multiple locations. These locations include more than 25 VPCs, other AWS Regions, and the on-premises environment. The company already has set up an AWS Direct Connect connection between the on-premises environment and AWS.
What should the SAP solutions architect do to meet these requirements in the MOST scalable manner?
- A. Use VPC peering between the VPCs to route traffic between them.
- B. Use AWS Transit Gateway to connect the VPCs and on-premises networks together. (Most Voted)
- C. Use a Network Load Balancer to route connections to various targets within VPCs.
- D. Deploy a Direct Connect gateway to connect the Direct Connect connection over a private VIF to one or more VPCs in any accounts.
Correct Answer: D

The correct approach to connect multiple VPCs and on-premises networks in a scalable manner is indeed through AWS Transit Gateway. It serves as a network transit hub, allowing the interconnection of thousands of VPCs across different AWS Regions and on-premises environments. This method streamlines network management without requiring multiple, complex peering arrangements and provides enhanced security and performance for accessing SAP NetWeaver workloads on SAP HANA from various locations. This eliminates the need for numerous Direct Connect connections or private VIFs that would limit scalability and increase complexity.
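As a rough illustration of the Transit Gateway pattern described above (not this architect's actual configuration), the boto3 sketch below creates a transit gateway hub, attaches one VPC, and adds a static route so the cluster's overlay IP, which lies outside the VPC CIDR, is reachable from other attachments. All IDs, subnet IDs, and the overlay IP CIDR are hypothetical.

```python
# Minimal boto3 sketch (hypothetical IDs and CIDRs) of a Transit Gateway hub,
# a per-VPC attachment, and a static route for the cluster's overlay IP.
import boto3

ec2 = boto3.client("ec2", region_name="eu-central-1")  # assumed Region

# Create the transit gateway hub.
tgw = ec2.create_transit_gateway(
    Description="SAP hub for VPCs, other Regions, and on-premises",
    Options={"AmazonSideAsn": 64512, "DefaultRouteTableAssociation": "enable"},
)
tgw_id = tgw["TransitGateway"]["TransitGatewayId"]

# Attach one of the VPCs (repeat per VPC; Direct Connect gateway and
# inter-Region peering attachments are created separately).
attachment = ec2.create_transit_gateway_vpc_attachment(
    TransitGatewayId=tgw_id,
    VpcId="vpc-0abc1234def567890",            # hypothetical
    SubnetIds=["subnet-0aaa1111bbb2222cc"],   # hypothetical
)

# Point the overlay IP (outside the VPC CIDR) at the SAP HANA VPC attachment
# so that the other attachments can still reach the cluster after a failover.
ec2.create_transit_gateway_route(
    DestinationCidrBlock="192.168.10.10/32",                  # hypothetical overlay IP
    TransitGatewayRouteTableId="tgw-rtb-0123456789abcdef0",   # hypothetical
    TransitGatewayAttachmentId=attachment["TransitGatewayVpcAttachment"]["TransitGatewayAttachmentId"],
)
```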
Question #4
A company is implementing SAP HANA on AWS. According to the company’s security policy, SAP backups must be encrypted. Only authorized team members can have the ability to decrypt the SAP backups.
What is the MOST operationally efficient solution that meets these requirements?
- A. Configure AWS Backint Agent for SAP HANA to create SAP backups in an Amazon S3 bucket. After a backup is created, encrypt the backup by using client-side encryption. Share the encryption key with authorized team members only.
- B. Configure AWS Backint Agent for SAP HANA to use AWS Key Management Service (AWS KMS) for SAP backups. Create a key policy to grant decryption permission to authorized team members only. (Most Voted)
- C. Configure AWS Storage Gateway to transfer SAP backups from a file system to an Amazon S3 bucket. Use an S3 bucket policy to grant decryption permission to authorized team members only.
- D. Configure AWS Backint Agent for SAP HANA to use AWS Key Management Service (AWS KMS) for SAP backups. Grant object ACL decryption permission to authorized team members only.
Correct Answer: C

The use of AWS Backint Agent for SAP HANA with AWS Key Management Service (KMS) is indeed the most operationally efficient method to meet the encryption requirements for SAP backups as outlined. AWS KMS simplifies the management of encryption keys while ensuring that only authorized personnel can access the decryption keys, upholding the company’s strict security policies. By integrating KMS into the backup process, businesses can streamline operations and ensure that sensitive data management complies with security standards, all without the additional overhead of manual encryption processes. This method leverages AWS's robust security features and automation capabilities, making it an optimal solution.
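The key-policy part of this pattern can be sketched with boto3. The snippet below is illustrative only: the account ID and the authorized-team role ARN are hypothetical, and the resulting key ARN would then be referenced in the AWS Backint Agent configuration so that backups written to Amazon S3 are encrypted with this key.

```python
# Minimal boto3 sketch (hypothetical account ID and role ARN) of a KMS key whose
# key policy grants decrypt permission only to an authorized-team role.
import json
import boto3

kms = boto3.client("kms", region_name="eu-central-1")  # assumed Region

ACCOUNT_ID = "111122223333"  # hypothetical
AUTHORIZED_ROLE = f"arn:aws:iam::{ACCOUNT_ID}:role/sap-backup-admins"  # hypothetical

key_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # Keep key administration with the account so the key cannot become unmanageable.
            "Sid": "EnableRootAccountAdministration",
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{ACCOUNT_ID}:root"},
            "Action": "kms:*",
            "Resource": "*",
        },
        {   # Only the authorized team role may decrypt SAP backups.
            "Sid": "AllowDecryptForAuthorizedTeamOnly",
            "Effect": "Allow",
            "Principal": {"AWS": AUTHORIZED_ROLE},
            "Action": ["kms:Decrypt", "kms:DescribeKey"],
            "Resource": "*",
        },
    ],
}

key = kms.create_key(
    Description="SAP HANA backup encryption key",
    Policy=json.dumps(key_policy),
)
print(key["KeyMetadata"]["Arn"])  # reference this ARN in the Backint Agent configuration
```

A further statement granting kms:Encrypt and kms:GenerateDataKey to the role or instance profile under which the Backint Agent writes backups would also be needed; it is omitted here for brevity.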
Question #5
A data analysis company has two SAP landscapes that consist of sandbox, development, QA, pre-production, and production servers. One landscape is on Windows, and the other landscape is on Red Hat Enterprise Linux. The servers reside in a room in a building that other tenants share.
An SAP solutions architect proposes to migrate the SAP applications to AWS. The SAP solutions architect wants to move the production backups to AWS and wants to make the backups highly available to restore in case of unavailability of an on-premises server.
Which solution will meet these requirements MOST cost-effectively?
- A. Take a backup of the production servers. Implement an AWS Storage Gateway Volume Gateway. Create file shares by using the Storage Gateway Volume Gateway. Copy the backup files to the file shares through NFS and SMB.
- B. Take a backup of the production servers. Send those backups to tape drives. Implement an AWS Storage Gateway Tape Gateway. Send the backups to Amazon S3 Standard-Infrequent Access (S3 Standard-IA) through the S3 console. Move the backups immediately to S3 Glacier Deep Archive.
- C. Implement a third-party tool to take images of the SAP application servers and database server. Take regular snapshots at 1-hour intervals. Send the snapshots to Amazon S3 Glacier directly through the S3 Glacier console. Store the same images in different S3 buckets in different AWS Regions.
- D. Take a backup of the production servers. Implement an Amazon S3 File Gateway. Create file shares by using the S3 File Gateway. Copy the backup files to the file shares through NFS and SMB. Map backup files directly to Amazon S3. Configure an S3 Lifecycle policy to send the backup files to S3 Glacier based on the company's data retention policy. (Most Voted)
Correct Answer: C

In this scenario, which calls for highly available and cost-efficient storage of SAP production backups on AWS, the described solution uses a third-party tool to take images of the SAP application servers and database server, combined with regular snapshots. Frequent snapshots keep the backups current, and storing copies in S3 buckets in different AWS Regions improves availability and mitigates Region-specific risks. Sending the images directly to Amazon S3 Glacier keeps storage costs low, which fits the need for durable, long-term backup retention. This approach addresses both the disaster recovery and cost-efficiency aspects of the migration.
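However the backups reach Amazon S3, the move to S3 Glacier discussed above (and called out explicitly in option D's lifecycle policy) is expressed as an S3 Lifecycle rule. The boto3 sketch below is a minimal illustration, assuming a hypothetical bucket name, prefix, and retention period rather than the company's actual data retention policy.

```python
# Minimal boto3 sketch (hypothetical bucket, prefix, and retention values) of an
# S3 Lifecycle rule that moves backup objects to S3 Glacier and later expires them.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="sap-production-backups",  # hypothetical
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "sap-backups-to-glacier",
                "Filter": {"Prefix": "backups/"},                 # hypothetical prefix
                "Status": "Enabled",
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 365},                      # assumed retention period
            }
        ]
    },
)
```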