
[2021.5] New Valid Amazon SCS-C01 Practice Questions Free Share From Pass4itsure

The Amazon AWS SCS-C01 exam is difficult, but with the Pass4itsure SCS-C01 dumps https://www.pass4itsure.com/aws-certified-security-specialty.html as preparation material, candidates can pass it more easily. The SCS-C01 practice tests mirror the format of the actual exam, so mastering the techniques you pick up through practice makes it easier to reach your target score.

Amazon AWS SCS-C01 pdf free https://drive.google.com/file/d/1Bq5cLgqNu9IeOx9rxsLYwn9rEB8AFqDq/view?usp=sharing


Latest Amazon AWS SCS-C01 practice exam questions here:

QUESTION 1
You need a cloud security device that can generate encryption keys validated to FIPS 140-2 Level 3.
Which of the following can be used for this purpose?
Please select:
A. AWS KMS
B. AWS Customer Keys
C. AWS managed keys
D. AWS Cloud HSM
Correct Answer: AD
AWS Key Management Service (KMS) now uses FIPS 140-2 validated hardware security modules (HSMs) and supports FIPS 140-2 validated endpoints, which provide independent assurances about the confidentiality and integrity of your keys. All master keys in AWS KMS, regardless of their creation date or origin, are automatically protected using FIPS 140-2 validated HSMs.

FIPS 140-2 defines four levels of security, simply named "Level 1" to "Level 4". It does not specify in detail what level of security is required by any particular application.
1. FIPS 140-2 Level 1, the lowest, imposes very limited requirements; loosely, all components must be "production-grade" and various egregious kinds of insecurity must be absent.
2. FIPS 140-2 Level 2 adds requirements for physical tamper-evidence and role-based authentication.
3. FIPS 140-2 Level 3 adds requirements for physical tamper-resistance (making it difficult for attackers to gain access to sensitive information contained in the module), identity-based authentication, and a physical or logical separation between the interfaces by which "critical security parameters" enter and leave the module and its other interfaces.
4. FIPS 140-2 Level 4 makes the physical security requirements more stringent and requires robustness against environmental attacks.

AWS CloudHSM provides you with a FIPS 140-2 Level 3 validated single-tenant HSM cluster in your Amazon Virtual Private Cloud (VPC) to store and use your keys. You have exclusive control over how your keys are used via an authentication mechanism independent from AWS. You interact with keys in your AWS CloudHSM cluster similar to the way you interact with your applications running in Amazon EC2.

AWS KMS allows you to create and control the encryption keys used by your applications and supported AWS services in multiple regions around the world from a single console. The service uses FIPS 140-2 validated HSMs to protect the security of your keys. Centralized management of all your keys in AWS KMS lets you enforce who can use your keys under which conditions, when they get rotated, and who can manage them. AWS KMS HSMs are validated at Level 2 overall and at Level 3 in the following areas:
1. Cryptographic Module Specification
2. Roles, Services, and Authentication
3. Physical Security
4. Design Assurance

So both A and D are acceptable answers for this question.
https://aws.amazon.com/blogs/security/aws-key-management-service-now-offers-fips-140-2-validated-cryptographic-modules-enabling-easier-adoption-of-the-service-for-regulated-workloads/
https://aws.amazon.com/cloudhsm/faqs/
https://aws.amazon.com/kms/faqs/
https://en.wikipedia.org/wiki/FIPS_140-2

The AWS documentation mentions the following: AWS CloudHSM is a cloud-based hardware security module (HSM) that enables you to easily generate and use your own encryption keys on the AWS Cloud. With CloudHSM, you can manage your own encryption keys using FIPS 140-2 Level 3 validated HSMs. CloudHSM offers you the flexibility to integrate with your applications using industry-standard APIs, such as PKCS#11, Java Cryptography Extensions (JCE), and Microsoft CryptoNG (CNG) libraries. CloudHSM is also standards-compliant and enables you to export all of your keys to most other commercially available HSMs. It is a fully managed service that automates time-consuming administrative tasks for you, such as hardware provisioning, software patching, high availability, and backups. CloudHSM also enables you to scale quickly by adding and removing HSM capacity on demand, with no up-front costs. All other options are invalid since AWS CloudHSM is the prime service that offers FIPS 140-2 Level 3 compliance. For more information on CloudHSM, please visit the following URL:
https://aws.amazon.com/cloudhsm/ The correct answers are: AWS KMS, AWS Cloud HSM
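To make the distinction concrete, here is a minimal boto3 sketch (the region, custom key store ID, and descriptions are assumptions for illustration) showing a standard KMS key next to a CloudHSM-backed key created through a KMS custom key store:

```python
import boto3

kms = boto3.client("kms", region_name="us-east-1")  # assumed region

# A standard customer managed KMS key (KMS HSMs: FIPS 140-2 Level 2 overall).
kms_key = kms.create_key(Description="Standard KMS-backed key")

# A key whose material lives in your CloudHSM cluster (FIPS 140-2 Level 3),
# exposed through a KMS custom key store. The key store ID is a placeholder.
hsm_key = kms.create_key(
    Description="CloudHSM-backed key via custom key store",
    Origin="AWS_CLOUDHSM",
    CustomKeyStoreId="cks-1234567890abcdef0",
)

print(kms_key["KeyMetadata"]["Arn"])
print(hsm_key["KeyMetadata"]["Arn"])
```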

QUESTION 2
A company has a forensic logging use case whereby several hundred applications running on Docker on EC2 need to
send logs to a central location. The Security Engineer must create a logging solution that is able to perform real-time
analytics on the log files, grants the ability to replay events, and persists data.
Which AWS Services, together, can satisfy this use case? (Select two.)
A. Amazon Elasticsearch
B. Amazon Kinesis
C. Amazon SQS
D. Amazon CloudWatch
E. Amazon Athena
Correct Answer: AB
https://docs.aws.amazon.com/whitepapers/latest/aws-overview/analytics.html#amazon-athena
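As a rough illustration of the Kinesis half of this answer, the boto3 sketch below pushes one container log event into a Kinesis data stream (the region and stream name are assumptions); Amazon Elasticsearch would then consume the stream, for example via Kinesis Data Firehose or a Lambda consumer, to provide the real-time analytics piece.

```python
import json
import time
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")  # assumed region

def ship_log(container_id: str, message: str) -> None:
    """Send one log event to a central Kinesis stream (stream name is hypothetical)."""
    record = {
        "container_id": container_id,
        "timestamp": int(time.time() * 1000),
        "message": message,
    }
    kinesis.put_record(
        StreamName="central-forensic-logs",      # assumed stream name
        Data=json.dumps(record).encode("utf-8"),
        PartitionKey=container_id,               # spread containers across shards
    )

ship_log("app-42", "user login failed: bad password")
```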

QUESTION 3
A company has an AWS account and allows a third-party contractor, who uses another AWS account, to assume
certain IAM roles. The company wants to ensure that IAM roles can be assumed by the contractor only if the contractor
has multi-factor authentication enabled on their IAM user accounts.
What should the company do to accomplish this?
A. Add the following condition to the IAM policy attached to all IAM roles:
“Effect”: “Deny”,
“Condition” : { “BoolIfExists” : { “aws:MultiFactorAuthPresent” : false } }
B. Add the following condition to the IAM policy attached to all IAM roles:
“Effect”: “Deny”,
“Condition” : { “Bool” : { “aws:MultiFactorAuthPresent” : false } }
C. Add the following condition to the IAM policy attached to all IAM roles:
“Effect”: “Allow”,
“Condition” : { “Null” : { “aws:MultiFactorAuthPresent” : false } }
D. Add the following condition to the IAM policy attached to all IAM roles:
“Effect”: “Allow”,
“Condition” : { “BoolIfExists” : { “aws:MultiFactorAuthPresent” : false } }
Correct Answer: A
Reference: https://aws-orgs.readthedocs.io/_/downloads/en/latest/pdf/ (18)
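The pattern behind option A is a Deny statement that fires whenever the multi-factor context key is missing or false. A minimal sketch of such a statement, written as a Python dict attached to a role with boto3 (the role name, policy name, action, and resource are placeholders, and a real policy would be scoped more tightly):

```python
import json
import boto3

iam = boto3.client("iam")

# Deny everything unless the caller authenticated with MFA.
# "BoolIfExists" also denies when the context key is missing entirely.
deny_without_mfa = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyWhenNoMFA",
            "Effect": "Deny",
            "Action": "*",
            "Resource": "*",
            "Condition": {
                "BoolIfExists": {"aws:MultiFactorAuthPresent": "false"}
            },
        }
    ],
}

# Role name and policy name are assumptions for illustration.
iam.put_role_policy(
    RoleName="ContractorRole",
    PolicyName="require-mfa",
    PolicyDocument=json.dumps(deny_without_mfa),
)
```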

QUESTION 4
Your company is planning on using bastion hosts for administering the servers in AWS. Which of the following is the
best description of a bastion host from a security perspective?
Please select:
A. A Bastion host should be on a private subnet and never a public subnet due to security concerns
B. A Bastion host sits on the outside of an internal network and is used as a gateway into the private network and is
considered the critical strong point of the network
C. Bastion hosts allow users to log in using RDP or SSH and use that session to SSH into the internal network to access
private subnet resources.
D. A Bastion host should maintain extremely tight security and monitoring as it is available to the public
Correct Answer: C
A bastion host is a special-purpose computer on a network specifically designed and configured to withstand attacks. The computer generally hosts a single application, for example a proxy server, and all other services are removed or limited to reduce the threat to the computer. In AWS, a bastion host is kept on a public subnet. Users log on to the bastion host via SSH or RDP and then use that session to manage other hosts in the private subnets. Options A and B are invalid because the bastion host needs to sit on a public subnet. Option D is invalid because bastion hosts are not used for monitoring. For more information on bastion hosts, just browse to the below URL: https://docs.aws.amazon.com/quickstart/latest/linux-bastion/architecture.html The correct answer is: Bastion hosts allow users to log in using RDP or SSH and use that session to SSH into the internal network to access private subnet resources
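To illustrate the pattern, here is a hedged boto3 sketch (the region, security group IDs, and admin CIDR are placeholders): the private instances accept SSH only from the bastion's security group, while the bastion itself accepts SSH only from a known admin address.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

BASTION_SG = "sg-0bastion000000000"   # placeholder security group IDs
PRIVATE_SG = "sg-0private000000000"
ADMIN_CIDR = "203.0.113.1/32"         # assumed admin workstation address

# Bastion (public subnet): SSH only from the admin workstation.
ec2.authorize_security_group_ingress(
    GroupId=BASTION_SG,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
        "IpRanges": [{"CidrIp": ADMIN_CIDR, "Description": "admin SSH"}],
    }],
)

# Private instances: SSH only from the bastion's security group.
ec2.authorize_security_group_ingress(
    GroupId=PRIVATE_SG,
    IpPermissions=[{
        "IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
        "UserIdGroupPairs": [{"GroupId": BASTION_SG, "Description": "from bastion"}],
    }],
)
```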

QUESTION 5
You have an S3 bucket defined in AWS. You want to ensure that you encrypt the data before sending it across the wire.
What is the best way to achieve this?
Please select:
A. Enable server side encryption for the S3 bucket. This request will ensure that the data is encrypted first.
B. Use the AWS Encryption CLI to encrypt the data first
C. Use a Lambda function to encrypt the data before sending it to the S3 bucket.
D. Enable client encryption for the bucket
Correct Answer: B
One can use the AWS Encryption CLI to encrypt the data before sending it across to the S3 bucket. Options A and C are invalid because the data would still be transferred in plain text before encryption happens. Option D is invalid because you cannot simply enable client-side encryption on an S3 bucket. For more information on encrypting and decrypting data, please visit the below URL: https://aws.amazon.com/blogs/security/how-to-encrypt-and-decrypt-your-data-with-the-aws-encryption-cli/
The correct answer is: Use the AWS Encryption CLI to encrypt the data first
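The answer names the AWS Encryption CLI specifically; as an illustration of the same client-side-encryption idea, here is a boto3 sketch that encrypts a small payload with the KMS Encrypt API before uploading only ciphertext to S3 (the key alias, bucket, and object key are assumptions; for larger data the Encryption CLI/SDK's envelope encryption would be used instead).

```python
import boto3

kms = boto3.client("kms", region_name="us-east-1")  # assumed region
s3 = boto3.client("s3", region_name="us-east-1")

plaintext = b"sensitive payload"

# Encrypt locally first (KMS Encrypt handles payloads up to 4 KB).
ciphertext = kms.encrypt(
    KeyId="alias/app-data-key",   # assumed key alias
    Plaintext=plaintext,
)["CiphertextBlob"]

# Only ciphertext ever crosses the wire to S3.
s3.put_object(
    Bucket="my-secure-bucket",    # assumed bucket name
    Key="data/record.bin",
    Body=ciphertext,
)
```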


QUESTION 6
An application is designed to run on an EC2 Instance. The application needs to work with an S3 bucket. From a
security perspective, what is the ideal way for the EC2 instance/ application to be configured?
Please select:
A. Use the AWS access keys ensuring that they are frequently rotated.
B. Assign an IAM user to the application that has specific access to only that S3 bucket
C. Assign an IAM Role and assign it to the EC2 Instance
D. Assign an IAM group and assign it to the EC2 Instance
Correct Answer: C
The below diagram from the AWS whitepaper shows the best security practice of allocating a role that has access to
the S3 bucket

[Diagram: SCS-C01 exam questions q6 – EC2 instance using an IAM role to access the S3 bucket]

Options A, B, and D are invalid because using IAM users, groups, or access keys is poor security practice when granting access to AWS resources from other AWS resources. For more information on security best practices, please visit the following URL: https://d1.awsstatic.com/whitepapers/Security/AWS_Security_Best_Practices.pdf The correct answer is:
Assign an IAM Role and assign it to the EC2 Instance
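With an instance profile attached, the SDK on the instance picks up temporary role credentials automatically and no keys are stored on the box. A minimal sketch, assuming a hypothetical bucket and object key:

```python
import boto3

# No access keys anywhere in the code or on disk: boto3 resolves temporary
# credentials from the instance metadata service via the attached IAM role.
s3 = boto3.client("s3")

response = s3.get_object(
    Bucket="application-data-bucket",   # assumed bucket name
    Key="config/settings.json",         # assumed object key
)
print(response["Body"].read().decode("utf-8"))
```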


QUESTION 7
Every application in a company's portfolio has a separate AWS account for development and production. The security
team wants to prevent the root user and all IAM users in the production accounts from accessing a specific set of
unneeded services. How can they control this functionality?
Please select:
A. Create a Service Control Policy that denies access to the services. Assemble all production accounts in an
organizational unit. Apply the policy to that organizational unit.
B. Create a Service Control Policy that denies access to the services. Apply the policy to the root account.
C. Create an IAM policy that denies access to the services. Associate the policy with an IAM group and enlist all users
and the root users in this group.
D. Create an IAM policy that denies access to the services. Create a Config Rule that checks that all users have the
policy assigned. Trigger a Lambda function that adds the policy when found missing.
Correct Answer: A
As an administrator of the master account of an organization, you can restrict which AWS services and individual API actions the users and roles in each member account can access. This restriction even overrides the administrators of member accounts in the organization. When AWS Organizations blocks access to a service or API action for a member account, a user or role in that account cannot access any prohibited service or API action, even if an administrator of a member account explicitly grants such permissions in an IAM policy. Organization permissions overrule account permissions. Option B is invalid because applying the policy at the organization root would affect every account, including development accounts, not just production. Options C and D are invalid because IAM policies at the account level cannot restrict the root user. The correct answer is: Create a Service Control Policy that denies access to the services. Assemble all production accounts in an organizational unit. Apply the policy to that organizational unit
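A rough boto3 sketch of option A (the denied services, policy name, and OU ID are assumptions for illustration): create the SCP in the management account and attach it to the production organizational unit.

```python
import json
import boto3

orgs = boto3.client("organizations")

# Deny a couple of unneeded services everywhere in the production OU.
scp = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyUnneededServices",
            "Effect": "Deny",
            "Action": ["sagemaker:*", "gamelift:*"],   # assumed unneeded services
            "Resource": "*",
        }
    ],
}

policy = orgs.create_policy(
    Name="deny-unneeded-services",
    Description="Block services not used by production workloads",
    Type="SERVICE_CONTROL_POLICY",
    Content=json.dumps(scp),
)

orgs.attach_policy(
    PolicyId=policy["Policy"]["PolicySummary"]["Id"],
    TargetId="ou-prod-12345678",   # placeholder production OU ID
)
```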

QUESTION 8
An EC2 Instance hosts a Java-based application that accesses a DynamoDB table. This EC2 Instance is currently serving
production-based users. Which of the following is a secure way of ensuring that the EC2 Instance accesses the DynamoDB
table?
Please select:
A. Use IAM Roles with permissions to interact with DynamoDB and assign it to the EC2 Instance
B. Use KMS keys with the right permissions to interact with DynamoDB and assign it to the EC2 Instance
C. Use IAM Access Keys with the right permissions to interact with DynamoDB and assign it to the EC2 Instance
D. Use IAM Access Groups with the right permissions to interact with DynamoDB and assign it to the EC2 Instance
Correct Answer: A
To ensure secure access to AWS resources from EC2 Instances, always assign a Role to the EC2 Instance. Option B is invalid because KMS keys are not a mechanism for giving EC2 Instances access to AWS services. Option C is invalid because access keys are not a safe mechanism for giving EC2 Instances access to AWS services. Option D is invalid because access groups cannot be assigned to EC2 Instances.
For more information on IAM Roles, please refer to the below URL:
https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html The correct answer is: Use IAM Roles with permissions to interact with DynamoDB and assign it to the EC2 Instance
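The wiring behind option A, sketched with boto3 (the role, policy, profile, table ARN, and account ID are placeholders): create a role EC2 can assume, grant it scoped DynamoDB permissions, and bind it to the instance through an instance profile.

```python
import json
import boto3

iam = boto3.client("iam")

trust = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "ec2.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}
iam.create_role(RoleName="AppDynamoRole", AssumeRolePolicyDocument=json.dumps(trust))

# Scope the permissions to the single table the application uses (ARN assumed).
iam.put_role_policy(
    RoleName="AppDynamoRole",
    PolicyName="dynamo-table-access",
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:Query"],
            "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/AppTable",
        }],
    }),
)

iam.create_instance_profile(InstanceProfileName="AppDynamoProfile")
iam.add_role_to_instance_profile(
    InstanceProfileName="AppDynamoProfile", RoleName="AppDynamoRole"
)
# The profile is then attached to the EC2 instance at launch or later via the
# EC2 associate_iam_instance_profile API.
```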

QUESTION 9
A company's Security Officer is concerned about the risk of AWS account root user logins and has assigned a Security
Engineer to implement a notification solution for near-real-time alerts upon account root user logins.
How should the Security Engineer meet these requirements?
A. Create a cron job that runs a script to download the AWS IAM security credentials file, parse the file for account root
user logins, and email the Security team's distribution list.
B. Run AWS CloudTrail logs through Amazon CloudWatch Events to detect account root user logins and trigger an
AWS Lambda function to send an Amazon SNS notification to the Security team's distribution list.
C. Save AWS CloudTrail logs to an Amazon S3 bucket in the Security team's account. Process the CloudTrail logs with
the Security Engineer's logging solution for account root user logins. Send an Amazon SNS notification to the Security
team upon encountering the account root user login events.
D. Save VPC Flow Logs to an Amazon S3 bucket in the Security team's account and process the VPC Flow Logs with
their logging solutions for account root user logins. Send an Amazon SNS notification to the Security team upon
encountering the account root user login events.
Correct Answer: B
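A rough sketch of option B's event path, using boto3 to create the CloudWatch Events (EventBridge) rule for root console sign-ins. For brevity the rule here targets an SNS topic directly, whereas option B inserts a Lambda function between the rule and SNS; the rule name, topic ARN, and region are assumptions.

```python
import json
import boto3

events = boto3.client("events", region_name="us-east-1")  # assumed region

# Match console sign-in events performed by the account root user.
root_login_pattern = {
    "detail-type": ["AWS Console Sign In via CloudTrail"],
    "detail": {"userIdentity": {"type": ["Root"]}},
}

events.put_rule(
    Name="root-user-console-login",
    EventPattern=json.dumps(root_login_pattern),
    State="ENABLED",
)

events.put_targets(
    Rule="root-user-console-login",
    Targets=[{
        "Id": "notify-security-team",
        "Arn": "arn:aws:sns:us-east-1:111122223333:security-alerts",  # placeholder
    }],
)
```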


QUESTION 10
Your company hosts critical data in an S3 bucket. There is a requirement to ensure that all data is encrypted. There is
also metadata about the information stored in the bucket that needs to be encrypted as well. Which of the below
measures would you take to ensure that the metadata is encrypted?
Please select:
A. Put the metadata as metadata for each object in the S3 bucket and then enable S3 Server side encryption.
B. Put the metadata as metadata for each object in the S3 bucket and then enable S3 Server KMS encryption.
C. Put the metadata in a DynamoDB table and ensure the table is encrypted during creation time.
D. Put the metadata in the S3 bucket itself.
Correct Answer: C
Options A, B, and D are all invalid because the metadata would not be encrypted in any of those cases, and that is a key requirement of the question. One key thing to note is that when S3 bucket objects are encrypted, the metadata is not encrypted. So the best option is to use an encrypted DynamoDB table. Important: All GET and PUT requests for an object protected by AWS KMS will fail if they are not made via SSL or by using SigV4. SSE-KMS encrypts only the object data; object metadata is not encrypted. For more information on using KMS encryption for S3, please refer to the below URL: https://docs.aws.amazon.com/AmazonS3/latest/dev/UsingKMSEncryption.html The correct answer is: Put the metadata in a DynamoDB table and ensure the table is encrypted during creation time.
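A minimal boto3 sketch of option C (the table name, key schema, KMS alias, and region are assumptions): create the metadata table with server-side encryption under a customer managed KMS key.

```python
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")  # assumed region

dynamodb.create_table(
    TableName="object-metadata",   # assumed table name
    AttributeDefinitions=[{"AttributeName": "object_key", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "object_key", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
    SSESpecification={
        "Enabled": True,
        "SSEType": "KMS",
        # Omit KMSMasterKeyId to use the AWS managed key; set it to use your own.
        "KMSMasterKeyId": "alias/metadata-table-key",   # assumed alias
    },
)
```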

QUESTION 11
A company is hosting a website that must be accessible to users for HTTPS traffic. Also port 22 should be open for
administrative purposes. The administrator's workstation has a static IP address of 203.0.113.1/32. Which of the
following security group configurations are the MOST secure but still functional to support these requirements? Choose
2 answers from the options given below
A. Port 443 coming from 0.0.0.0/0
B. Port 443 coming from 10.0.0.0/16
C. Port 22 coming from 0.0.0.0/0
D. Port 22 coming from 203.0.113.1/32
Correct Answer: AD
Since HTTPS traffic is required for all users on the Internet, port 443 should be open to 0.0.0.0/0. Port 22 should be restricted to the administrator's workstation IP address (203.0.113.1/32). Option B is invalid because it only allows traffic from an internal CIDR block, not from the internet. Option C is invalid because allowing port 22 from the entire internet is a security risk. For more information on AWS Security Groups, please visit the following URL:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-network-security.html The correct answers are: Port 443 coming from 0.0.0.0/0, Port 22 coming from 203.0.113.1/32
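The two chosen rules, expressed as a boto3 sketch (the security group ID and region are placeholders):

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",   # placeholder security group ID
    IpPermissions=[
        {   # HTTPS for every user on the internet
            "IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
            "IpRanges": [{"CidrIp": "0.0.0.0/0", "Description": "public HTTPS"}],
        },
        {   # SSH only from the administrator's workstation
            "IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
            "IpRanges": [{"CidrIp": "203.0.113.1/32", "Description": "admin SSH"}],
        },
    ],
)
```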


QUESTION 12
A company needs to encrypt all of its data stored in Amazon S3. The company wants to use AWS Key Management
Service (AWS KMS) to create and manage its encryption keys. The company's security policies require the ability to
import the company's own key material for the keys, set an expiration date on the keys, and delete keys immediately, if
needed.
How should a security engineer set up AWS KMS to meet these requirements?
A. Configure AWS KMS and use a custom key store. Create a customer managed CMK with no key material. Import the
company's keys and key material into the CMK.
B. Configure AWS KMS and use the default key store. Create an AWS managed CMK with no key material. Import the
company's key material into the CMK.
C. Configure AWS KMS and use the default key store. Create a customer managed CMK with no key material. Import the
company's key material into the CMK.
D. Configure AWS KMS and use a custom key store. Create an AWS managed CMK with no key material. Import the
company's key material into the CMK.
Correct Answer: C
Imported (bring-your-own) key material is supported only for customer managed CMKs in the default KMS key store; it cannot be used with a custom key store or with AWS managed CMKs, and it supports both an expiration date and immediate deletion of the key material.
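A rough boto3 sketch of that flow (the region, key description, and expiration date are assumptions, and the local wrapping step is illustrative): create a CMK with external origin, fetch the import parameters, wrap your own 256-bit key with the returned public key, and import it with an expiration.

```python
import datetime
import os

import boto3
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

kms = boto3.client("kms", region_name="us-east-1")  # assumed region

# Customer managed CMK in the default key store, with no key material yet.
key_id = kms.create_key(Origin="EXTERNAL", Description="BYOK CMK")["KeyMetadata"]["KeyId"]

# Fetch the public wrapping key and import token.
params = kms.get_parameters_for_import(
    KeyId=key_id,
    WrappingAlgorithm="RSAES_OAEP_SHA_256",
    WrappingKeySpec="RSA_2048",
)

# Wrap locally generated 256-bit key material with the returned public key.
plaintext_key = os.urandom(32)
wrapping_key = serialization.load_der_public_key(params["PublicKey"])
encrypted_material = wrapping_key.encrypt(
    plaintext_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)

kms.import_key_material(
    KeyId=key_id,
    ImportToken=params["ImportToken"],
    EncryptedKeyMaterial=encrypted_material,
    ExpirationModel="KEY_MATERIAL_EXPIRES",
    ValidTo=datetime.datetime(2022, 12, 31),   # assumed expiration date
)

# The key material can later be removed immediately if needed:
# kms.delete_imported_key_material(KeyId=key_id)
```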

QUESTION 13
A company needs to retain log data archives for several years to be compliant with regulations. The log data is no
longer used, but it must be retained.
What is the MOST secure and cost-effective solution to meet these requirements?
A. Archive the data to Amazon S3 and apply a restrictive bucket policy to deny the s3:DeleteObject API.
B. Archive the data to Amazon S3 Glacier and apply a Vault Lock policy.
C. Archive the data to Amazon S3 and replicate it to a second bucket in a second AWS Region. Choose the S3
Standard-Infrequent Access (S3 Standard-IA) storage class and apply a restrictive bucket policy to deny the
s3:DeleteObject API.
D. Migrate the log data to a 16 TB Amazon Elastic Block Store (Amazon EBS) volume. Create a snapshot of the EBS
volume.
Correct Answer: B
S3 Glacier is the lowest-cost option for archives that are no longer accessed, and a Vault Lock policy, once completed, cannot be changed or removed, which gives a stronger retention guarantee than a bucket policy that an administrator could later delete (options A and C).
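A hedged boto3 sketch of that approach (the vault name, account ID, region, and seven-year retention period are assumptions): create the vault, attach a deny-delete lock policy, and complete the lock to make it immutable.

```python
import json
import boto3

glacier = boto3.client("glacier", region_name="us-east-1")  # assumed region

VAULT = "compliance-log-archive"   # assumed vault name
glacier.create_vault(vaultName=VAULT)

# Deny archive deletion until each archive is at least 7 years old (assumed period).
lock_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "deny-delete-before-retention",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "glacier:DeleteArchive",
        "Resource": f"arn:aws:glacier:us-east-1:111122223333:vaults/{VAULT}",
        "Condition": {"NumericLessThan": {"glacier:ArchiveAgeInDays": "2555"}},
    }],
}

# Locking is a two-step process: initiate returns a lock ID, and completing the
# lock within 24 hours makes the policy immutable.
lock = glacier.initiate_vault_lock(
    vaultName=VAULT, policy={"Policy": json.dumps(lock_policy)}
)
glacier.complete_vault_lock(vaultName=VAULT, lockId=lock["lockId"])
```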

Welcome to download the valid Pass4itsure SCS-C01 pdf

Free download: Google Drive
Amazon AWS SCS-C01 pdf https://drive.google.com/file/d/1Bq5cLgqNu9IeOx9rxsLYwn9rEB8AFqDq/view?usp=sharing

Pass4itsure latest Amazon exam dumps coupon code free share

Summary:

New Amazon SCS-C01 exam questions from Pass4itsure SCS-C01 dumps! Welcome to download the newest Pass4itsure SCS-C01 dumps https://www.pass4itsure.com/aws-certified-security-specialty.html (499 Q&As), verified, with the latest SCS-C01 practice test questions and relevant answers.

Amazon AWS SCS-C01 dumps pdf free share https://drive.google.com/file/d/1Bq5cLgqNu9IeOx9rxsLYwn9rEB8AFqDq/view?usp=sharing