[2021.5] New Valid Amazon SCS-C01 Practice Questions Free Share From Pass4itsure

Amazon AWS SCS-C01 is difficult. But with the Pass4itsure SCS-C01 dumps https://www.pass4itsure.com/aws-certified-security-specialty.html preparation material, candidates can pass it more easily. The SCS-C01 practice tests let you practice on questions in the same format as the actual exam. If you master the techniques you gain through practice, it will be easier to achieve your target score.

Amazon AWS SCS-C01 pdf free https://drive.google.com/file/d/1Bq5cLgqNu9IeOx9rxsLYwn9rEB8AFqDq/view?usp=sharing

Latest Amazon SCS-C01 dumps practice test video tutorial

Latest Amazon AWS SCS-C01 practice exam questions at here:

QUESTION 1
You need a cloud security service that can generate encryption keys on hardware validated to FIPS 140-2 Level 3.
Which of the following can be used for this purpose?
Please select:
A. AWS KMS
B. AWS Customer Keys
C. AWS managed keys
D. AWS Cloud HSM
Correct Answer: AD
AWS Key Management Service (KMS) now uses FIPS 140-2 validated hardware security modules (HSMs) and supports FIPS 140-2 validated endpoints, which provide independent assurances about the confidentiality and integrity of your keys. All master keys in AWS KMS, regardless of their creation date or origin, are automatically protected using FIPS 140-2 validated HSMs.

FIPS 140-2 defines four levels of security, simply named "Level 1" to "Level 4". It does not specify in detail what level of security is required by any particular application:

- FIPS 140-2 Level 1, the lowest, imposes very limited requirements; loosely, all components must be "production-grade" and various egregious kinds of insecurity must be absent.
- FIPS 140-2 Level 2 adds requirements for physical tamper-evidence and role-based authentication.
- FIPS 140-2 Level 3 adds requirements for physical tamper-resistance (making it difficult for attackers to gain access to sensitive information contained in the module), identity-based authentication, and a physical or logical separation between the interfaces by which "critical security parameters" enter and leave the module and its other interfaces.
- FIPS 140-2 Level 4 makes the physical security requirements more stringent and requires robustness against environmental attacks.

AWS CloudHSM provides you with a FIPS 140-2 Level 3 validated single-tenant HSM cluster in your Amazon Virtual Private Cloud (VPC) to store and use your keys. You have exclusive control over how your keys are used via an authentication mechanism independent from AWS. You interact with keys in your AWS CloudHSM cluster in much the same way you interact with your applications running in Amazon EC2.

AWS KMS allows you to create and control the encryption keys used by your applications and supported AWS services in multiple regions around the world from a single console. The service uses FIPS 140-2 validated HSMs to protect the security of your keys. Centralized management of all your keys in AWS KMS lets you enforce who can use your keys under which conditions, when they get rotated, and who can manage them. AWS KMS HSMs are validated at Level 2 overall and at Level 3 in the following areas:

1. Cryptographic Module Specification
2. Roles, Services, and Authentication
3. Physical Security
4. Design Assurance

So this question has two answers: both A and D.
https://aws.amazon.com/blogs/security/aws-key-management-service-now-offers-fips-140-2-validated-cryptographic-modules-enabling-easier-adoption-of-the-service-for-regulated-workloads/
https://aws.amazon.com/cloudhsm/faqs/
https://aws.amazon.com/kms/faqs/
https://en.wikipedia.org/wiki/FIPS_140-2

The AWS documentation mentions the following: AWS CloudHSM is a cloud-based hardware security module (HSM) that enables you to easily generate and use your own encryption keys on the AWS Cloud. With CloudHSM, you can manage your own encryption keys using FIPS 140-2 Level 3 validated HSMs. CloudHSM offers you the flexibility to integrate with your applications using industry-standard APIs, such as PKCS#11, Java Cryptography Extensions (JCE), and Microsoft CryptoNG (CNG) libraries. CloudHSM is also standards-compliant and enables you to export all of your keys to most other commercially available HSMs. It is a fully managed service that automates time-consuming administrative tasks for you, such as hardware provisioning, software patching, high availability, and backups. CloudHSM also enables you to scale quickly by adding and removing HSM capacity on demand, with no up-front costs. Of the two, AWS CloudHSM is the primary service that offers FIPS 140-2 Level 3 compliance. For more information on CloudHSM, please visit the following URL:
https://aws.amazon.com/cloudhsm/
The correct answers are: AWS KMS, AWS CloudHSM

QUESTION 2
A company has a forensic logging use case whereby several hundred applications running on Docker on EC2 need to
send logs to a central location. The Security Engineer must create a logging solution that is able to perform real-time
analytics on the log files, grants the ability to replay events, and persists data.
Which AWS Services, together, can satisfy this use case? (Select two.)
A. Amazon Elasticsearch
B. Amazon Kinesis
C. Amazon SQS
D. Amazon CloudWatch
E. Amazon Athena
Correct Answer: AB
https://docs.aws.amazon.com/whitepapers/latest/aws-overview/analytics.html#amazon-athena

QUESTION 3
A company has an AWS account and allows a third-party contractor, who uses another AWS account, to assume
certain IAM roles. The company wants to ensure that IAM roles can be assumed by the contractor only if the contractor
has multi-factor authentication enabled on their IAM user accounts.
What should the company do to accomplish this?
A. Add the following condition to the IAM policy attached to all IAM roles:
“Effect”: “Deny”,
“Condition” : { “BoolIfExists” : { “aws:MultiFactorAuthPresent” : false } }
B. Add the following condition to the IAM policy attached to all IAM roles:
“Effect”: “Deny”,
“Condition” : { “Bool” : { “aws:MultiFactorAuthPresent” : false } }
C. Add the following condition to the IAM policy attached to all IAM roles:
“Effect”: “Allow”,
“Condition” : { “Null” : { “aws:MultiFactorAuthPresent” : false } }
D. Add the following condition to the IAM policy attached to all IAM roles:
“Effect”: “Allow”,
“Condition” : { “BoolIfExists” : { “aws:MultiFactorAuthPresent” : false } }
Correct Answer: A
Reference: https://aws-orgs.readthedocs.io/_/downloads/en/latest/pdf/
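As a sketch of answer A, the deny statement below (a hypothetical policy; the wildcard action and resource are illustrative) uses the BoolIfExists operator so that requests in which the aws:MultiFactorAuthPresent key is absent entirely, such as those signed with long-term access keys, are also denied:

```python
import json

# Hypothetical statement for answer A: deny everything unless the caller
# authenticated with MFA. BoolIfExists also matches requests where the
# aws:MultiFactorAuthPresent key is missing (e.g. long-term access keys).
mfa_deny_statement = {
    "Effect": "Deny",
    "Action": "*",        # illustrative: deny all actions without MFA
    "Resource": "*",
    "Condition": {
        "BoolIfExists": {"aws:MultiFactorAuthPresent": "false"}
    },
}

policy = {"Version": "2012-10-17", "Statement": [mfa_deny_statement]}
print(json.dumps(policy, indent=2))
```

A plain Bool test (option B) would not catch requests where the key is absent, which is why the IfExists variant is the safer choice here.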

QUESTION 4
Your company is planning on using bastion hosts for administering the servers in AWS. Which of the following is the
best description of a bastion host from a security perspective?
Please select:
A. A Bastion host should be on a private subnet and never a public subnet due to security concerns
B. A Bastion host sits on the outside of an internal network and is used as a gateway into the private network and is
considered the critical strong point of the network
C. Bastion hosts allow users to log in using RDP or SSH and use that session to SSH into the internal network to access
private subnet resources.
D. A Bastion host should maintain extremely tight security and monitoring as it is available to the public
Correct Answer: C
A bastion host is a special-purpose computer on a network specifically designed and configured to withstand attacks.
The computer generally hosts a single application, for example a proxy server, and all other services are removed or
limited to reduce the threat to the computer. In AWS, a bastion host is kept on a public subnet. Users log on to the
bastion host via SSH or RDP and then use that session to manage other hosts in the private subnets. Options A and B
are invalid because the bastion host needs to sit on the public network. Option D is invalid because, while bastion hosts
should be tightly secured, that alone does not describe how they are used. For more information on bastion hosts, browse to the URL below:
https://docs.aws.amazon.com/quickstart/latest/linux-bastion/architecture.html The correct answer is: Bastion hosts allow
users to log in using RDP or SSH and use that session to SSH into the internal network to access private subnet
resources

QUESTION 5
You have an S3 bucket defined in AWS. You want to ensure that you encrypt the data before sending it across the wire.
What is the best way to achieve this?
Please select:
A. Enable server side encryption for the S3 bucket. This request will ensure that the data is encrypted first.
B. Use the AWS Encryption CLI to encrypt the data first
C. Use a Lambda function to encrypt the data before sending it to the S3 bucket.
D. Enable client encryption for the bucket
Correct Answer: B
One can use the AWS Encryption CLI to encrypt the data before sending it across to the S3 bucket. Option A is invalid
because server-side encryption would still mean that the data is transferred in plain text. Option C is invalid because a
Lambda function is an unnecessarily complex way to encrypt data client-side. Option D is invalid because you cannot just
enable client-side encryption for the S3 bucket. For more information on encrypting and decrypting data, please visit the
URL below: https://aws.amazon.com/blogs/security/how-to-encrypt-and-decrypt-your-data-with-the-aws-encryption-cli/
The correct answer is: Use the AWS Encryption CLI to encrypt the data first


QUESTION 6
An application is designed to run on an EC2 instance. The application needs to work with an S3 bucket. From a
security perspective, what is the ideal way for the EC2 instance/application to be configured?
Please select:
A. Use the AWS access keys ensuring that they are frequently rotated.
B. Assign an IAM user to the application that has specific access to only that S3 bucket
C. Assign an IAM Role and assign it to the EC2 Instance
D. Assign an IAM group and assign it to the EC2 Instance
Correct Answer: C
The below diagram from the AWS whitepaper shows the best security practice of allocating a role that has access to
the S3 bucket

[Diagram: SCS-C01 exam questions, Q6 – an IAM role attached to the EC2 instance grants access to the S3 bucket]

Options A, B, and D are invalid because using users, groups, or access keys is poor security practice when granting
one AWS resource access to another. For more information on security best practices, please visit the
following URL: https://d1.awsstatic.com/whitepapers/Security/AWS_Security_Best_Practices.pdf The correct answer is:
Assign an IAM Role and assign it to the EC2 Instance
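The role's permissions policy might look like the following sketch (the bucket name is a placeholder; a real policy should grant only the actions the application actually uses):

```python
import json

bucket = "example-app-bucket"  # placeholder bucket name
role_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        # Grant only the object operations the application needs.
        "Action": ["s3:GetObject", "s3:PutObject"],
        "Resource": [f"arn:aws:s3:::{bucket}/*"],
    }],
}
print(json.dumps(role_policy, indent=2))
```

The policy is attached to the role, the role to an instance profile, and the instance profile to the EC2 instance, so the application never handles long-term credentials.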


QUESTION 7
Every application in a company's portfolio has a separate AWS account for development and production. The security
team wants to prevent the root user and all IAM users in the production accounts from accessing a specific set of
unneeded services. How can they control this functionality?
Please select:
A. Create a Service Control Policy that denies access to the services. Assemble all production accounts in an
organizational unit. Apply the policy to that organizational unit.
B. Create a Service Control Policy that denies access to the services. Apply the policy to the root account.
C. Create an IAM policy that denies access to the services. Associate the policy with an IAM group and enlist all users
and the root users in this group.
D. Create an IAM policy that denies access to the services. Create a Config Rule that checks that all users have the
policy assigned. Trigger a Lambda function that adds the policy when found missing.
Correct Answer: A
As an administrator of the master account of an organization, you can restrict which AWS services and individual API
actions the users and roles in each member account can access. This restriction even overrides the administrators of
member accounts in the organization. When AWS Organizations blocks access to a service or API action for a member
account, a user or role in that account can't access any prohibited service or API action, even if an administrator of a
member account explicitly grants such permissions in an IAM policy. Organization permissions overrule account
permissions. Option B is invalid because the policy should be applied to the organizational unit containing the
production accounts, not to the root account. Options C and D are invalid because IAM policies alone cannot restrict
the root user of an account.
The correct answer is: Create a Service Control Policy that denies access to the services. Assemble all production
accounts in an organizational unit. Apply the policy to that organizational unit
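An SCP for this scenario could look like the sketch below (the denied services are placeholders for whatever the security team considers unneeded):

```python
import json

# Illustrative Service Control Policy: the denied services here are
# placeholders, not the ones from the question.
scp = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyUnneededServices",
        "Effect": "Deny",
        "Action": ["dynamodb:*", "rds:*"],  # example services to block
        "Resource": "*",
    }],
}
print(json.dumps(scp))
```

Attached to the production OU, this denies the listed services to every IAM user and role in the member accounts, regardless of their IAM permissions.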

QUESTION 8
An EC2 instance hosts a Java-based application that accesses a DynamoDB table. This EC2 instance is currently serving
production-based users. Which of the following is a secure way of ensuring that the EC2 instance can access the DynamoDB
table?
Please select:
A. Use IAM Roles with permissions to interact with DynamoDB and assign it to the EC2 Instance
B. Use KMS keys with the right permissions to interact with DynamoDB and assign it to the EC2 Instance
C. Use IAM Access Keys with the right permissions to interact with DynamoDB and assign it to the EC2 Instance
D. Use IAM Access Groups with the right permissions to interact with DynamoDB and assign it to the EC2 Instance
Correct Answer: A
To ensure secure access to AWS resources from EC2 instances, always assign a role to the EC2
instance. Option B is invalid because KMS keys are not used as a mechanism for providing EC2 instances access to
AWS services. Option C is invalid because access keys are not a safe mechanism for providing EC2 instances access to AWS
services. Option D is invalid because there is no way access groups can be assigned to EC2 instances.
For more information on IAM roles, please refer to the URL below:
https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html The correct answer is: Use IAM Roles with
permissions to interact with DynamoDB and assign it to the EC2 Instance

QUESTION 9
A company's Security Officer is concerned about the risk of AWS account root user logins and has assigned a Security
Engineer to implement a notification solution for near-real-time alerts upon account root user logins.
How should the Security Engineer meet these requirements?
A. Create a cron job that runs a script to download the AWS IAM credential report, parse the file for account root
user logins, and email the Security team's distribution list.
B. Run AWS CloudTrail logs through Amazon CloudWatch Events to detect account root user logins and trigger an
AWS Lambda function to send an Amazon SNS notification to the Security team's distribution list.
C. Save AWS CloudTrail logs to an Amazon S3 bucket in the Security team's account. Process the CloudTrail logs with
the Security Engineer's logging solution for account root user logins. Send an Amazon SNS notification to the Security
team upon encountering the account root user login events.
D. Save VPC Flow Logs to an Amazon S3 bucket in the Security team's account and process the VPC Flow Logs with
their logging solution for account root user logins. Send an Amazon SNS notification to the Security team upon
encountering the account root user login events.
Correct Answer: B
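The detection logic behind answer B can be illustrated with a small matcher. The event below is a trimmed, hypothetical CloudTrail console sign-in record, and the matcher implements only a simplified subset of real EventBridge pattern semantics:

```python
# Simplified sketch of the CloudWatch Events (EventBridge) rule: each
# pattern leaf lists the values the corresponding event field may take.
pattern = {
    "detail-type": ["AWS Console Sign In via CloudTrail"],
    "detail": {"userIdentity": {"type": ["Root"]}},
}

# Trimmed, hypothetical sample of a root console login event.
root_login = {
    "detail-type": "AWS Console Sign In via CloudTrail",
    "detail": {"userIdentity": {"type": "Root"}, "eventName": "ConsoleLogin"},
}

def matches(pattern, event):
    # Recursively require every pattern field to appear in the event with
    # one of the listed values (nested dicts are matched field by field).
    for key, expected in pattern.items():
        value = event.get(key)
        if isinstance(expected, dict):
            if not isinstance(value, dict) or not matches(expected, value):
                return False
        elif value not in expected:
            return False
    return True

print(matches(pattern, root_login))  # True: this event would fire the rule
```

A matching event triggers the Lambda target, which publishes the SNS notification to the Security team's distribution list.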


QUESTION 10
Your company hosts critical data in an S3 bucket. There is a requirement to ensure that all data is encrypted. There is
also metadata about the information stored in the bucket that needs to be encrypted as well. Which of the below
measures would you take to ensure that the metadata is encrypted?
Please select:
A. Put the metadata as metadata for each object in the S3 bucket and then enable S3 Server side encryption.
B. Put the metadata as metadata for each object in the S3 bucket and then enable S3 Server KMS encryption.
C. Put the metadata in a DynamoDB table and ensure the table is encrypted during creation time.
D. Put the metadata in the S3 bucket itself.
Correct Answer: C
Options A, B, and D are all invalid because the metadata will not be encrypted in any case, and this is a key requirement
of the question. One key thing to note is that when S3 bucket objects are encrypted, the metadata is not
encrypted. So the best option is to use an encrypted DynamoDB table. Important: all GET and PUT requests for an object protected by AWS KMS will fail if they are not made via SSL or by using SigV4. SSE-KMS encrypts only the
object data; any object metadata is not encrypted. For more information on using KMS encryption for S3, please refer to
the URL below: https://docs.aws.amazon.com/AmazonS3/latest/dev/UsingKMSEncryption.html The correct answer is: Put
the metadata in a DynamoDB table and ensure the table is encrypted during creation time.
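Illustratively, the parameters passed to DynamoDB's CreateTable call (the table and attribute names are hypothetical) would request encryption at creation time via the SSESpecification field:

```python
# Hypothetical CreateTable parameters; boto3's dynamodb client would
# receive this dict. Table and attribute names are made up.
create_table_params = {
    "TableName": "s3-object-metadata",
    "AttributeDefinitions": [
        {"AttributeName": "ObjectKey", "AttributeType": "S"},
    ],
    "KeySchema": [{"AttributeName": "ObjectKey", "KeyType": "HASH"}],
    "BillingMode": "PAY_PER_REQUEST",
    # Encryption is requested at creation time with a KMS-backed key.
    "SSESpecification": {"Enabled": True, "SSEType": "KMS"},
}
print(create_table_params["SSESpecification"])
```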

QUESTION 11
A company is hosting a website that must be accessible to users via HTTPS. Port 22 should also be open for
administrative purposes. The administrator's workstation has a static IP address of 203.0.113.1/32. Which of the
following security group configurations are the MOST secure but still functional to support these requirements? Choose
2 answers from the options given below
A. Port 443 coming from 0.0.0.0/0
B. Port 443 coming from 10.0.0.0/16
C. Port 22 coming from 0.0.0.0/0
D. Port 22 coming from 203.0.113.1/32
Correct Answer: AD
Since HTTPS traffic is required for all users on the Internet, port 443 should be open to all IP addresses. Port 22
should be restricted to the administrator's IP address. Option B is invalid because it only allows HTTPS traffic from a
particular CIDR block and not from the internet. Option C is invalid because allowing port 22 from the internet is a security risk. For
more information on AWS security groups, please visit the following URL:
https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-network-security.html The correct answers are: Port
443 coming from 0.0.0.0/0, Port 22 coming from 203.0.113.1/32
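The two chosen rules can be sanity-checked with Python's ipaddress module (the source addresses are documentation-range placeholders):

```python
import ipaddress

# 0.0.0.0/0 admits every HTTPS user, while the /32 admits only the
# administrator's workstation for SSH.
def allowed(source_ip, cidr):
    return ipaddress.ip_address(source_ip) in ipaddress.ip_network(cidr)

print(allowed("198.51.100.7", "0.0.0.0/0"))       # True: any user, port 443
print(allowed("203.0.113.1", "203.0.113.1/32"))   # True: the admin, port 22
print(allowed("198.51.100.7", "203.0.113.1/32"))  # False: everyone else
```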


QUESTION 12
A company needs to encrypt all of its data stored in Amazon S3. The company wants to use AWS Key Management
Service (AWS KMS) to create and manage its encryption keys. The company's security policies require the ability to
import the company's own key material for the keys, set an expiration date on the keys, and delete keys immediately, if
needed.
How should a security engineer set up AWS KMS to meet these requirements?
A. Configure AWS KMS and use a custom key store. Create a customer managed CMK with no key material. Import the
company's key material into the CMK.
B. Configure AWS KMS and use the default key store. Create an AWS managed CMK with no key material. Import the
company's key material into the CMK.
C. Configure AWS KMS and use the default key store. Create a customer managed CMK with no key material. Import the
company's key material into the CMK.
D. Configure AWS KMS and use a custom key store. Create an AWS managed CMK with no key material. Import the
company's key material into the CMK.
Correct Answer: C
Key material can be imported only into a customer managed CMK created in the default key store with its origin set to
EXTERNAL; custom (CloudHSM-backed) key stores and AWS managed CMKs do not support imported key material.
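As a sketch, the CreateKey request for a customer managed key that will receive imported material sets Origin to EXTERNAL (parameter names follow the AWS KMS CreateKey API; the description is made up):

```python
# Hypothetical CreateKey parameters for a CMK with imported key material.
create_key_params = {
    "Description": "CMK for company-supplied key material",  # made up
    "KeyUsage": "ENCRYPT_DECRYPT",
    # EXTERNAL means the CMK is created without key material; the company
    # imports its own material (optionally with an expiration) afterwards.
    "Origin": "EXTERNAL",
}
print(create_key_params["Origin"])
```

After creation, the company wraps its key material with the public key from GetParametersForImport and calls ImportKeyMaterial, optionally with a validity period.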

QUESTION 13
A company needs to retain log data archives for several years to be compliant with regulations. The log data is no
longer used, but it must be retained.
What is the MOST secure and cost-effective solution to meet these requirements?
A. Archive the data to Amazon S3 and apply a restrictive bucket policy to deny the s3:DeleteObject API.
B. Archive the data to Amazon S3 Glacier and apply a Vault Lock policy.
C. Archive the data to Amazon S3 and replicate it to a second bucket in a second AWS Region. Choose the S3
Standard-Infrequent Access (S3 Standard-IA) storage class and apply a restrictive bucket policy to deny the
s3:DeleteObject API.
D. Migrate the log data to a 16 TB Amazon Elastic Block Store (Amazon EBS) volume. Create a snapshot of the EBS
volume.
Correct Answer: B
S3 Glacier is the lowest-cost option for archives that are no longer read, and a Vault Lock policy, once locked, cannot
be changed or removed, whereas a restrictive bucket policy can later be altered.
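An illustrative Vault Lock policy for option B (the account ID, region, and vault name are placeholders) denies archive deletion until the archives reach the retention age; once the lock is completed, the policy itself can no longer be changed:

```python
import json

# Illustrative S3 Glacier Vault Lock policy; the resource ARN is a placeholder.
vault_lock_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "deny-deletes-until-retention-passes",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "glacier:DeleteArchive",
        "Resource": "arn:aws:glacier:us-east-1:123456789012:vaults/log-archive",
        "Condition": {
            # Deny deletion while the archive is younger than ~7 years.
            "NumericLessThan": {"glacier:ArchiveAgeInDays": "2555"}
        },
    }],
}
print(json.dumps(vault_lock_policy, indent=2))
```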


Summary:

New Amazon SCS-C01 exam questions from Pass4itsure SCS-C01 dumps! Welcome to download the newest Pass4itsure SCS-C01 dumps https://www.pass4itsure.com/aws-certified-security-specialty.html (499 Q&As), verified the latest SCS-C01 practice test questions with relevant answers.


[2021.3] Valid Amazon AWS SCS-C01 Practice Questions Free Share From Pass4itsure


Amazon AWS SCS-C01 pdf free https://drive.google.com/file/d/1JRPXuxAvU2SKyppRM8NVWT0LSCp3gArr/view?usp=sharing

Latest Amazon SCS-C01 dumps Practice test video tutorial

Latest Amazon AWS SCS-C01 practice exam questions at here:

QUESTION 1
Which technique can be used to integrate AWS IAM (Identity and Access Management) with an on-premise LDAP
(Lightweight Directory Access Protocol) directory service?
Please select:
A. Use an IAM policy that references the LDAP account identifiers and the AWS credentials.
B. Use SAML (Security Assertion Markup Language) to enable single sign-on between AWS and LDAP.
C. Use AWS Security Token Service from an identity broker to issue short-lived AWS credentials.
D. Use IAM roles to automatically rotate the IAM credentials when LDAP credentials are updated.
Correct Answer: B
On the AWS Blog site, the following information is present to help in this context: the newly released whitepaper, Single
Sign-On: Integrating AWS, OpenLDAP, and Shibboleth, will help you integrate your existing LDAP-based user directory
with AWS. When you integrate your existing directory with AWS, your users can access AWS by using their existing
credentials. This means that your users don't need to maintain yet another user name and password just to access
AWS resources.
Options A, C, and D are all invalid because in this sort of configuration you have to use SAML to enable single sign-on.
For more information on integrating AWS with LDAP for single sign-on, please visit the following URL:
https://aws.amazon.com/blogs/security/new-whitepaper-single-sign-on-integrating-aws-openldap-and-shibboleth/
The correct answer is: Use SAML (Security Assertion Markup Language) to enable single sign-on between
AWS and LDAP.


QUESTION 2
An application running on EC2 instances in a VPC must call an external web service via TLS (port 443). The instances
run in public subnets.
Which configurations below allow the application to function and minimize the exposure of the instances? Select 2
answers from the options given below
Please select:
A. A network ACL with a rule that allows outgoing traffic on port 443.
B. A network ACL with rules that allow outgoing traffic on port 443 and incoming traffic on ephemeral ports
C. A network ACL with rules that allow outgoing traffic on port 443 and incoming traffic on port 443.
D. A security group with a rule that allows outgoing traffic on port 443
E. A security group with rules that allow outgoing traffic on port 443 and incoming traffic on ephemeral ports.
F. A security group with rules that allow outgoing traffic on port 443 and incoming traffic on port 443.
Correct Answer: BD
Since here the traffic needs to flow outbound from the Instance to a web service on Port 443, the outbound rules on
both the Network and Security Groups need to allow outbound traffic. The Incoming traffic should be allowed on
ephemeral ports for the Operating System on the Instance to allow a connection to be established on any desired or
available port. Option A is invalid because this rule alone is not enough. You also need to ensure incoming traffic on
ephemeral ports Option C is invalid because need to ensure incoming traffic on ephemeral ports and not only port 443
Options E and F are invalid since here you are allowing additional ports on Security groups which are not required For
more information on VPC Security Groups, please visit the below URL:
https://docs.aws.amazon.com/AmazonVPC/latest/UserGuideA/PC_SecurityGroups.htmll The correct answers are: A
network ACL with rules that allow outgoing traffic on port 443 and incoming traffic on ephemeral ports, A security group
with a rule that allows outgoing traffic on port 443
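A toy model of the stateless-versus-stateful distinction behind these answers (not real packet filtering, just the rule-evaluation logic):

```python
EPHEMERAL = range(1024, 65536)  # ports the OS picks for return traffic

def nacl_allows_return(port, inbound_rules):
    # Network ACLs are stateless: the return packet is evaluated against
    # the inbound rules on its own, so ephemeral ports must be listed.
    return any(port in rule for rule in inbound_rules)

def sg_allows_return(outbound_was_allowed):
    # Security groups are stateful: replies to an allowed outbound
    # connection are permitted automatically, no inbound rule needed.
    return outbound_was_allowed

nacl_inbound = [EPHEMERAL]                      # the extra rule answer B requires
print(nacl_allows_return(49152, nacl_inbound))  # True with the rule
print(nacl_allows_return(49152, []))            # False without it
print(sg_allows_return(True))                   # True: stateful tracking
```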


QUESTION 3
In response to past DDoS attacks, a Security Engineer has set up an Amazon CloudFront distribution for
an Amazon S3 bucket. There is concern that some users may bypass the CloudFront distribution and access the S3
bucket directly.
What must be done to prevent users from accessing the S3 objects directly by using URLs?
A. Change the S3 bucket/object permission so that only the bucket owner has access.
B. Set up a CloudFront origin access identity (OAI), and change the S3 bucket/object permission so that only the OAI
has access.
C. Create IAM roles for CloudFront and change the S3 bucket/object permission so that only the IAM role has access.
D. Redirect S3 bucket access to the corresponding CloudFront distribution.
Correct Answer: B
https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-restrictingaccess-to-s3.html
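The bucket policy behind answer B might look like the sketch below (the OAI ID and bucket name are placeholders):

```python
import json

# Illustrative bucket policy: only the CloudFront origin access identity
# may read objects, so direct S3 URLs stop working.
oai_bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {
            # CloudFront OAIs are referenced with this ARN form; the
            # trailing ID is a placeholder.
            "AWS": "arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity EXAMPLEID"
        },
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-bucket/*",  # placeholder bucket
    }],
}
print(json.dumps(oai_bucket_policy))
```

Combined with removing any public-read grants from the bucket, this forces all reads through the distribution.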

QUESTION 4
An application has been built with Amazon EC2 instances that retrieve messages from Amazon SQS. Recently, IAM
changes were made and the instances can no longer retrieve messages.
What actions should be taken to troubleshoot the issue while maintaining the least privilege? (Select two.)
A. Configure and assign an MFA device to the role used by the instances.
B. Verify that the SQS resource policy does not explicitly deny access to the role used by the instances.
C. Verify that the access key attached to the role used by the instances is active.
D. Attach the AmazonSQSFullAccess managed policy to the role used by the instances.
E. Verify that the role attached to the instances contains policies that allow access to the queue.
Correct Answer: BE


QUESTION 5
A Security Engineer is implementing a solution to allow users to seamlessly encrypt Amazon S3 objects without having
to touch the keys directly. The solution must be highly scalable without requiring continual management. Additionally,
the organization must be able to immediately delete the encryption keys.
Which solution meets these requirements?
A. Use AWS KMS with AWS managed keys and the ScheduleKeyDeletion API with a PendingWindowInDays set to 0 to
remove the keys if necessary.
B. Use KMS with imported key material and then use the DeleteImportedKeyMaterial API to remove the key
material if necessary.
C. Use AWS CloudHSM to store the keys and then use the CloudHSM API or the PKCS11 library to delete the keys if
necessary.
D. Use the Systems Manager Parameter Store to store the keys and then use the service API operations to delete the
key if necessary.
Correct Answer: B
Importing your own key material into KMS lets the organization delete the material immediately with
DeleteImportedKeyMaterial, while KMS remains fully managed and scales without continual administration; CloudHSM
would require ongoing cluster management.
https://docs.aws.amazon.com/kms/latest/developerguide/importing-keys-delete-key-material.html

QUESTION 6
After multiple compromises of its Amazon EC2 instances, a company's Security Officer is mandating that memory
dumps of compromised instances be captured for further analysis. A Security Engineer just received an EC2 abuse
notification report from AWS stating that an EC2 instance running the most recent Windows Server 2019 Base AMI is
compromised.
How should the Security Engineer collect a memory dump of the EC2 instance for forensic analysis?
A. Give consent to the AWS Security team to dump the memory core on the compromised instance and provide it to
AWS Support for analysis.
B. Review memory dump data that the AWS Systems Manager Agent sent to Amazon CloudWatch Logs.
C. Download and run the EC2Rescue for Windows Server utility from AWS.
D. Reboot the EC2 Windows Server, enter safe mode, and select memory dump.
Correct Answer: C
Under the shared responsibility model, AWS staff cannot access the memory of a customer's EC2 instance, so options
A and B are not possible; rebooting into safe mode (option D) would destroy the volatile memory. EC2Rescue for
Windows Server is the AWS-provided utility for collecting data, including memory dumps, from Windows instances for
offline analysis.


QUESTION 7
A Security Architect has been asked to review existing security architecture and identify why the application servers
cannot successfully initiate a connection to the database servers. The following summary describes the architecture:
1. An Application Load Balancer, an internet gateway, and a NAT gateway are configured in the public subnet.
2. Database, application, and web servers are configured on three different private subnets.
3. The VPC has two route tables: one for the public subnet and one for all other subnets. The route table for the public
subnet has a 0.0.0.0/0 route to the internet gateway. The route table for all other subnets has a 0.0.0.0/0 route to the
NAT gateway. All private subnets can route to each other.
4. Each subnet has a network ACL implemented that limits all inbound and outbound connectivity to only the required
ports and protocols.
5. There are 3 security groups (SGs): database, application, and web. Each group limits all inbound and outbound
connectivity to the minimum required.
Which of the following accurately reflects the access control mechanisms the Architect should verify?
A. Outbound SG configuration on database servers. Inbound SG configuration on application servers. Inbound and
outbound network ACL configuration on the database subnet. Inbound and outbound network ACL configuration on the
application server subnet.
B. Inbound SG configuration on database servers. Outbound SG configuration on application servers. Inbound and
outbound network ACL configuration on the database subnet. Inbound and outbound network ACL configuration on the
application server subnet.
C. Inbound and outbound SG configuration on database servers. Inbound and outbound SG configuration on application
servers. Inbound network ACL configuration on the database subnet. Outbound network ACL configuration on the
application server subnet.
D. Inbound SG configuration on database servers. Outbound SG configuration on application servers. Inbound network
ACL configuration on the database subnet. Outbound network ACL configuration on the application server subnet.
Correct Answer: B
The application servers initiate the connection, so the Architect should verify their outbound security group rules and
the database servers' inbound rules; because network ACLs are stateless, both inbound and outbound ACL rules on
both subnets must allow the traffic and its return path.

QUESTION 8
A company became aware that one of its access keys was exposed on a code-sharing website 11 days ago. A Security
Engineer must review all use of the exposed access keys to determine the extent of the exposure. The company
enabled AWS CloudTrail in all regions when it opened the account.
Which of the following will allow the Security Engineer to complete the task?
A. Filter the event history on the exposed access key in the CloudTrail console Examine the data from the past 11
days.
B. Use the AWS CLI to generate an IAM credential report Extract all the data from the past 11 days.
C. Use Amazon Athena to query the CloudTrail logs from Amazon S3. Retrieve the rows for the exposed access key for
the past 11 days.
D. Use the Access Advisor tab in the IAM console to view all of the access key activity for the past 11 days.
Correct Answer: C
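A hypothetical Athena query for answer C (the table name and window start are placeholders, and the exposed key shown is AWS's documented example key):

```python
access_key = "AKIAIOSFODNN7EXAMPLE"    # placeholder for the exposed key
window_start = "2021-05-01T00:00:00Z"  # placeholder: 11 days before today

# Table name is a placeholder for a table defined over the CloudTrail
# S3 prefix; eventtime in CloudTrail logs is an ISO-8601 string, so a
# lexicographic comparison against the window start works.
query = f"""
SELECT eventtime, eventname, eventsource, sourceipaddress
FROM cloudtrail_logs
WHERE useridentity.accesskeyid = '{access_key}'
  AND eventtime >= '{window_start}'
ORDER BY eventtime
"""
print(query)
```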

QUESTION 9
A company is deploying a new web application on AWS. Based on their other web applications, they anticipate being
the target of frequent DDoS attacks. Which steps can the company use to protect its application? Select 2 answers
from the options given below.
Please select:
A. Associate the EC2 instances with a security group that blocks traffic from blacklisted IP addresses.
B. Use an ELB Application Load Balancer and Auto Scaling group to scale to absorb application-layer traffic.
C. Use Amazon Inspector on the EC2 instances to examine incoming traffic and discard malicious traffic.
D. Use CloudFront and AWS WAF to prevent malicious traffic from reaching the application
E. Enable GuardDuty to block malicious traffic from reaching the application
Correct Answer: BD
The below diagram from AWS shows the best-case scenario for mitigating DDoS attacks using services such as Amazon
CloudFront, AWS WAF, ELB, and Auto Scaling

[Diagram: SCS-C01 exam questions, Q9 – DDoS-resilient architecture with CloudFront, AWS WAF, ELB, and Auto Scaling]

Option A is invalid because security groups deny all inbound traffic by default and cannot block specific source IPs. Option C is invalid because Amazon Inspector assesses instances for vulnerabilities; it cannot be used to examine traffic. Option E is invalid because GuardDuty detects threats against individual resources but does not block DDoS traffic aimed at the entire application. For more information on DDoS mitigation from AWS, please visit the URL below: https://aws.amazon.com/answers/networking/aws-ddos-attack-mitigation/ The correct answers are: Use an
ELB Application Load Balancer and Auto Scaling group to scale to absorb application-layer traffic; Use CloudFront and
AWS WAF to prevent malicious traffic from reaching the application
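The AWS WAF half of answer D is usually implemented as a rate-based rule attached to the CloudFront distribution's web ACL. A hedged sketch of such a rule in the WAFv2 API shape (the rule name and the 2000-request limit are illustrative choices, not from the question):

```python
import json

# Sketch of a WAFv2 rate-based rule: block any single source IP that
# exceeds 2000 requests in a rolling 5-minute window. This is the kind
# of rule you would include in a CloudFront-scoped web ACL to absorb
# application-layer floods.
rate_rule = {
    "Name": "throttle-floods",
    "Priority": 1,
    "Statement": {
        "RateBasedStatement": {
            "Limit": 2000,             # requests per 5-minute window
            "AggregateKeyType": "IP",  # count requests per source IP
        }
    },
    "Action": {"Block": {}},
    "VisibilityConfig": {
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "throttle-floods",
    },
}
print(json.dumps(rate_rule, indent=2))
```

CloudFront absorbs and filters traffic at the edge, so the rule blocks floods before they ever reach the origin.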


QUESTION 10
You have an EC2 instance in a private subnet that needs to access the KMS service. Which of the following methods
can help fulfill this requirement while keeping security in perspective?
Please select:
A. Use a VPC endpoint
B. Attach an Internet gateway to the subnet
C. Attach a VPN connection to the VPC
D. Use VPC Peering
Correct Answer: A
The AWS documentation mentions the following: You can connect directly to AWS KMS through a private endpoint in
your VPC instead of connecting over the internet. When you use a VPC endpoint, communication between your VPC
and AWS KMS is conducted entirely within the AWS network. Option B is invalid because an internet gateway exposes the subnet to threats from
the internet. Option C is invalid because a VPN is normally used for communication between on-premises environments and
AWS. Option D is invalid because VPC peering is normally used for communication between VPCs. For more information on
accessing KMS via an endpoint, please visit the following URL:
https://docs.aws.amazon.com/kms/latest/developerguide/kms-vpc-endpoint.html The correct answer is: Use a VPC endpoint
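A minimal sketch of the interface-endpoint parameters involved, in the shape the EC2 create-vpc-endpoint call expects. The VPC, subnet, and security group IDs are placeholders; only the service name format is fixed by AWS:

```python
# Sketch: parameters for an interface VPC endpoint that lets instances
# in a private subnet reach KMS without traversing the internet.
endpoint_params = {
    "VpcEndpointType": "Interface",
    "VpcId": "vpc-0abc1234",                       # placeholder
    "ServiceName": "com.amazonaws.us-east-1.kms",  # per-Region KMS service
    "SubnetIds": ["subnet-0abc1234"],              # the private subnet
    "SecurityGroupIds": ["sg-0abc1234"],           # must allow HTTPS (443)
    # With private DNS enabled, the standard KMS hostname resolves to
    # the endpoint's private IPs, so SDK code needs no changes.
    "PrivateDnsEnabled": True,
}
```

Because private DNS is enabled, existing applications keep calling the regional KMS hostname and are routed through the endpoint transparently.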

QUESTION 11
A company has external vendors that must deliver files to the company. These vendors have cross-account access that gives
them permission to upload objects to one of the company's S3 buckets.
What combination of steps must the vendor follow to successfully deliver a file to the company? Select 2 answers from
the options given below.
Please select:
A. Attach an IAM role to the bucket that grants the bucket owner full permissions to the object
B. Add a grant to the objects ACL giving full permissions to the bucket owner.
C. Encrypt the object with a KMS key controlled by the company.
D. Add a bucket policy to the bucket that grants the bucket owner full permissions to the object
E. Upload the file to the company's S3 bucket
Correct Answer: BE
This scenario is given in the AWS documentation: A bucket owner can enable other AWS accounts to upload objects.
These objects are owned by the accounts that created them. The bucket owner does not own objects that were not
created by the bucket owner. Therefore, for the bucket owner to gain access to these objects, the object owner must
first grant permission to the bucket owner through an object ACL. The bucket owner can then delegate those permissions
via a bucket policy. In this example, the bucket owner delegates permission to users in its own account.

scs-c01 exam questions-q11

Options A and D are invalid because the vendor cannot attach IAM roles or bucket policies to a bucket it does not own; the grant must be made through the object's ACL. Option C is not required since encryption is not part of the requirement. For more information on this scenario, please see the link below:
https://docs.aws.amazon.com/AmazonS3/latest/dev/example-walkthroughs-managing-access-example3.html The
correct answers are: Add a grant to the object's ACL giving full permissions to the bucket owner; Upload the file to the
company's S3 bucket
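In practice, the vendor can satisfy both steps in a single upload by applying the bucket-owner-full-control canned ACL. A hedged sketch of the boto3-style put_object parameters (bucket name, key, and body are placeholders):

```python
# Sketch: vendor-side upload that grants the bucket owner full control
# over the new object at the moment it is created, combining answers
# B and E into one call.
upload_params = {
    "Bucket": "company-inbound-bucket",   # the company's bucket (placeholder)
    "Key": "deliveries/report.csv",       # placeholder object key
    "Body": b"example file contents",     # placeholder payload
    # Canned ACL that adds a full-control grant for the bucket owner,
    # so the company can read and re-share the delivered object.
    "ACL": "bucket-owner-full-control",
}
```

Without that grant, the object would land in the bucket but remain readable only by the vendor's account.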


QUESTION 12
Your company has a set of EC2 Instances defined in AWS. These Ec2 Instances have strict security groups attached to
them. You need to ensure that changes to the Security groups are noted and acted on accordingly. How can you
achieve this?
Please select:
A. Use CloudWatch Logs to monitor the activity on the Security Groups. Use filters to search for the changes and use
SNS for the notification.
B. Use CloudWatch metrics to monitor the activity on the Security Groups. Use filters to search for the changes and use
SNS for the notification.
C. Use AWS Inspector to monitor the activity on the Security Groups. Use filters to search for the changes and use SNS
for the notification.
D. Use CloudWatch Events to be triggered for any changes to the Security Groups. Configure a Lambda function for
email notification as well.
Correct Answer: D
The diagram below, from an AWS blog, shows how security groups can be monitored. Option A is invalid because you
need to use CloudWatch Events, not CloudWatch Logs, to check for changes. Option B is invalid because you need to use CloudWatch Events, not CloudWatch metrics, to
check for changes. Option C is invalid because AWS Inspector is not used to monitor activity on security groups. For
more information on monitoring security groups, please visit the URL below: https://aws.amazon.com/blogs/security/how-to-automatically-revert-and-receive-notifications-about-changes-to-your-amazon-vpc-security-groups/ The correct
answer is: Use CloudWatch Events to be triggered for any changes to the Security Groups. Configure a Lambda
function for email notification as well.

scs-c01 exam questions-q12
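The CloudWatch Events rule in answer D matches on the CloudTrail records for security-group mutations. A hedged sketch of the event pattern such a rule would use (the exact list of event names to watch is a judgment call; these four cover inbound and outbound rule changes):

```python
import json

# Sketch: a CloudWatch Events (EventBridge) pattern that fires whenever
# a security group's rules change. A Lambda target subscribed to the
# rule would then send the email notification.
event_pattern = {
    "source": ["aws.ec2"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {
        "eventSource": ["ec2.amazonaws.com"],
        "eventName": [
            "AuthorizeSecurityGroupIngress",
            "AuthorizeSecurityGroupEgress",
            "RevokeSecurityGroupIngress",
            "RevokeSecurityGroupEgress",
        ],
    },
}
print(json.dumps(event_pattern, indent=2))
```

Because the pattern rides on CloudTrail's API-call events, CloudTrail must be enabled for the rule to receive anything.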

 

QUESTION 13
An external auditor finds that a company's user passwords have no minimum length. The company is currently using
two identity providers:
1.
AWS IAM federated with on-premises Active Directory
2.
Amazon Cognito user pools for accessing an AWS Cloud application developed by the company
Which combination of actions should the Security Engineer take to solve this issue? (Select TWO.)
A. Update the password length policy in the on-premises Active Directory configuration.
B. Update the password length policy in the IAM configuration.
C. Enforce an IAM policy in Amazon Cognito and AWS IAM with a minimum password length condition.
D. Update the password length policy in the Amazon Cognito configuration.
E. Create an SCP with AWS Organizations that enforces a minimum password length for AWS IAM and Amazon
Cognito.
Correct Answer: AD
Password length cannot be enforced through an IAM policy condition, so option C is not possible. The minimum length must be set in each identity provider that actually stores the passwords: the on-premises Active Directory policy and the Cognito user pool's password policy.
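For context on where the Cognito setting lives: a hedged sketch of the user pool password-policy structure as accepted by the Cognito update-user-pool API. The field names follow the Cognito API; the values are illustrative. The Active Directory side has no API equivalent here; it is set via Group Policy.

```python
# Sketch: the Cognito user pool password policy block where the
# minimum length is enforced. The value 14 is an illustrative choice.
cognito_policies = {
    "PasswordPolicy": {
        "MinimumLength": 14,
        "RequireUppercase": True,
        "RequireLowercase": True,
        "RequireNumbers": True,
        "RequireSymbols": True,
    }
}
```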

Welcome to download the valid Pass4itsure SCS-C01 pdf

Free download: Google Drive
Amazon AWS SCS-C01 pdf https://drive.google.com/file/d/1JRPXuxAvU2SKyppRM8NVWT0LSCp3gArr/view?usp=sharing

Pass4itsure latest Amazon exam dumps coupon code free share

Summary:

New Amazon SCS-C01 exam questions from Pass4itsure SCS-C01 dumps! Welcome to download the newest Pass4itsure SCS-C01 dumps https://www.pass4itsure.com/aws-certified-security-specialty.html (487 Q&As), verified the latest SCS-C01 practice test questions with relevant answers.

Amazon AWS SCS-C01 dumps pdf free share https://drive.google.com/file/d/1JRPXuxAvU2SKyppRM8NVWT0LSCp3gArr/view?usp=sharing

Latest Amazon Exam Dumps

Exam Name Free Online practice test Free PDF Dumps Premium Exam Dumps
AWS Certified Professional
AWS Certified DevOps Engineer – Professional (DOP-C01) Free DOP-C01 practice test (Online) Free DOP-C01 PDF Dumps (Download) pass4itsure DOP-C01 Exam Dumps (Premium)
AWS Certified Solutions Architect – Professional (SAP-C01) Free SAP-C01 practice test (Online) Free SAP-C01 PDF Dumps (Download) pass4itsure SAP-C01 Exam Dumps (Premium)
AWS Certified Associate
AWS Certified Developer – Associate (DVA-C01) Free DVA-C01 practice test (Online) Free DVA-C01 PDF Dumps (Download) pass4itsure DVA-C01 Exam Dumps (Premium)
AWS Certified Solutions Architect – Associate (SAA-C01) Free SAA-C01 practice test (Online) Free SAA-C01 PDF Dumps (Download) pass4itsure SAA-C01 Exam Dumps (Premium)
AWS Certified Solutions Architect – Associate (SAA-C02) Free SAA-C02 practice test (Online) Free SAA-C02 PDF Dumps (Download) pass4itsure SAA-C02 Exam Dumps (Premium)
AWS Certified SysOps Administrator – Associate (SOA-C01) Free SOA-C01 practice test (Online) Free SOA-C01 PDF Dumps (Download) pass4itsure SOA-C01 Exam Dumps (Premium)
AWS Certified Foundational
AWS Certified Cloud Practitioner (CLF-C01) Free CLF-C01 practice test (Online) Free CLF-C01 PDF Dumps (Download) pass4itsure CLF-C01 Exam Dumps (Premium)
AWS Certified Specialty
AWS Certified Advanced Networking – Specialty (ANS-C00) Free ANS-C00 practice test (Online) Free ANS-C00 PDF Dumps (Download) pass4itsure ANS-C00 Exam Dumps (Premium)
AWS Certified Database – Specialty (DBS-C01) Free DBS-C01 practice test (Online) Free DBS-C01 PDF Dumps (Download) pass4itsure DBS-C01 Exam Dumps (Premium)
AWS Certified Alexa Skill Builder – Specialty (AXS-C01) Free AXS-C01 practice test (Online) Free AXS-C01 PDF Dumps (Download) pass4itsure AXS-C01 Exam Dumps (Premium)
AWS Certified Big Data – Speciality (BDS-C00) Free BDS-C00 practice test (Online) Free BDS-C00 PDF Dumps (Download) pass4itsure BDS-C00 Exam Dumps (Premium)
AWS Certified Machine Learning – Specialty (MLS-C01) Free MLS-C01 practice test (Online) Free MLS-C01 PDF Dumps (Download) pass4itsure MLS-C01 Exam Dumps (Premium)
AWS Certified Security – Specialty (SCS-C01) Free SCS-C01 practice test (Online) Free SCS-C01 PDF Dumps (Download) pass4itsure SCS-C01 Exam Dumps (Premium)