This page was exported from Valid Premium Exam [ http://premium.validexam.com ]
Export date: Thu Sep 19 22:35:41 2024 / +0000 GMT

Title: Professional-Cloud-Developer Self-Study Guide for Becoming a Google Certified Professional - Cloud Developer Expert [Q64-Q79]

Professional-Cloud-Developer Study Guide
Realistic Verified Professional-Cloud-Developer Dumps

Q64. Your data is stored in Cloud Storage buckets. Fellow developers have reported that data downloaded from Cloud Storage is resulting in slow API performance. You want to research the issue to provide details to the GCP support team. Which command should you run?
A. gsutil test -o output.json gs://my-bucket
B. gsutil perfdiag -o output.json gs://my-bucket
C. gcloud compute scp example-instance:~/test-data -o output.json gs://my-bucket
D. gcloud services test -o output.json gs://my-bucket
Explanation/Reference: https://groups.google.com/forum/#!topic/gce-discussion/xBl9Jq5HDsY

Q65. Case Study

Company Overview
HipLocal is a community application designed to facilitate communication between people in close proximity. It is used for event planning and organizing sporting events, and for businesses to connect with their local communities. HipLocal launched recently in a few neighborhoods in Dallas and is rapidly growing into a global phenomenon. Its unique style of hyper-local community communication and business outreach is in demand around the world.

Executive Statement
We are the number one local community app; it's time to take our local community services global.
Our venture capital investors want to see rapid growth and the same great experience for new local and virtual communities that come online, whether their members are 10 or 10,000 miles away from each other.

Solution Concept
HipLocal wants to expand their existing service, with updated functionality, in new regions to better serve their global customers. They want to hire and train a new team to support these regions in their time zones. They will need to ensure that the application scales smoothly and provides clear uptime data.

Existing Technical Environment
HipLocal's environment is a mix of on-premises hardware and infrastructure running in Google Cloud Platform. The HipLocal team understands their application well, but has limited experience with global-scale applications. Their existing technical environment is as follows:
* Existing APIs run on Compute Engine virtual machine instances hosted in GCP.
* State is stored in a single-instance MySQL database in GCP.
* Data is exported to an on-premises Teradata/Vertica data warehouse.
* Data analytics is performed in an on-premises Hadoop environment.
* The application has no logging.
* There are basic indicators of uptime; alerts are frequently fired when the APIs are unresponsive.

Business Requirements
HipLocal's investors want to expand their footprint and support the increase in demand they are seeing.
Their requirements are:
* Expand availability of the application to new regions.
* Increase the number of concurrent users that can be supported.
* Ensure a consistent experience for users when they travel to different regions.
* Obtain user activity metrics to better understand how to monetize their product.
* Ensure compliance with regulations in the new regions (for example, GDPR).
* Reduce infrastructure management time and cost.
* Adopt the Google-recommended practices for cloud computing.

Technical Requirements
* The application and backend must provide usage metrics and monitoring.
* APIs require strong authentication and authorization.
* Logging must be increased, and data should be stored in a cloud analytics platform.
* Move to serverless architecture to facilitate elastic scaling.
* Provide authorized access to internal apps in a secure manner.

In order to meet their business requirements, how should HipLocal store their application state?
A. Use local SSDs to store state.
B. Put a memcache layer in front of MySQL.
C. Move the state storage to Cloud Spanner.
D. Replace the MySQL instance with Cloud SQL.

Q66. Your code is running on Cloud Functions in project A. It is supposed to write an object in a Cloud Storage bucket owned by project B. However, the write call is failing with the error "403 Forbidden". What should you do to correct the problem?
A. Grant your user account the roles/storage.objectCreator role for the Cloud Storage bucket.
B. Grant your user account the roles/iam.serviceAccountUser role for the service-PROJECTA@gcf-admin-robot.iam.gserviceaccount.com service account.
C. Grant the service-PROJECTA@gcf-admin-robot.iam.gserviceaccount.com service account the roles/storage.objectCreator role for the Cloud Storage bucket.
D. Enable the Cloud Storage API in project B.

Q67. Your data is stored in Cloud Storage buckets. Fellow developers have reported that data downloaded from Cloud Storage is resulting in slow API performance.
You want to research the issue to provide details to the GCP support team. Which command should you run?
A. gsutil test -o output.json gs://my-bucket
B. gsutil perfdiag -o output.json gs://my-bucket
C. gcloud compute scp example-instance:~/test-data -o output.json gs://my-bucket
D. gcloud services test -o output.json gs://my-bucket

Q68. You are planning to migrate a MySQL database to the managed Cloud SQL database for Google Cloud. You have Compute Engine virtual machine instances that will connect with this Cloud SQL instance. You do not want to whitelist IPs for the Compute Engine instances to be able to access Cloud SQL. What should you do?
A. Enable private IP for the Cloud SQL instance.
B. Whitelist a project to access Cloud SQL, and add Compute Engine instances in the whitelisted project.
C. Create a role in Cloud SQL that allows access to the database from external instances, and assign the Compute Engine instances to that role.
D. Create a Cloud SQL instance in one project. Create Compute Engine instances in a different project. Create a VPN between these two projects to allow internal access to Cloud SQL.
Explanation/Reference: https://cloud.google.com/sql/docs/mysql/connect-external-app

Q69. You have two tables in an ANSI-SQL compliant database with identical columns that you need to quickly combine into a single table, removing duplicate rows from the result set. What should you do?
A. Use the JOIN operator in SQL to combine the tables.
B. Use nested WITH statements to combine the tables.
C. Use the UNION operator in SQL to combine the tables.
D. Use the UNION ALL operator in SQL to combine the tables.
Reference: https://www.techonthenet.com/sql/union_all.php

Q70. Your application is logging to Stackdriver. You want to get the count of all requests on all /api/alpha/* endpoints. What should you do?
A. Add a Stackdriver counter metric for path:/api/alpha/.
B. Add a Stackdriver counter metric for endpoint:/api/alpha/*.
C. Export the logs to Cloud Storage and count lines matching /api/alpha.
D. Export the logs to Cloud Pub/Sub and count lines matching /api/alpha.

Q71. Which of the following statements empathize with the customer or help resolve a conflict? (Choose three.)
A. "Calm down. You are being ridiculous."
B. "These devices are expensive, I cannot change that."
C. "Please control yourself. That is not how it works."
D. "I'm sorry to hear that."
E. "I can understand why you are upset."
F. "It is not possible to repair it by Monday."
G. "You are right. I would be frustrated as well."

Q72. You are load testing your server application. During the first 30 seconds, you observe that a previously inactive Cloud Storage bucket is now servicing 2000 write requests per second and 7500 read requests per second. Your application is now receiving intermittent 5xx and 429 HTTP responses from the Cloud Storage JSON API as the demand escalates. You want to decrease the failed responses from the Cloud Storage API. What should you do?
A. Distribute the uploads across a large number of individual storage buckets.
B. Use the XML API instead of the JSON API for interfacing with Cloud Storage.
C. Pass the HTTP response codes back to clients that are invoking the uploads from your application.
D. Limit the upload rate from your application clients so that the dormant bucket's peak request rate is reached more gradually.
Reference: https://cloud.google.com/storage/docs/request-rate

Q73. Your application is built as a custom machine image. You have multiple unique deployments of the machine image. Each deployment is a separate managed instance group with its own template. Each deployment requires a unique set of configuration values. You want to provide these unique values to each deployment but use the same custom machine image in all deployments. You want to use out-of-the-box features of Compute Engine. What should you do?
A. Place the unique configuration values in the persistent disk.
B. Place the unique configuration values in a Cloud Bigtable table.
C. Place the unique configuration values in the instance template startup script.
D. Place the unique configuration values in the instance template instance metadata.

Q74. You have containerized a legacy application that stores its configuration on an NFS share. You need to deploy this application to Google Kubernetes Engine (GKE) and do not want the application serving traffic until after the configuration has been retrieved. What should you do?
A. Use the gsutil utility to copy files from within the Docker container at startup, and start the service using an ENTRYPOINT script.
B. Create a PersistentVolumeClaim on the GKE cluster. Access the configuration files from the volume, and start the service using an ENTRYPOINT script.
C. Use the COPY statement in the Dockerfile to load the configuration into the container image. Verify that the configuration is available, and start the service using an ENTRYPOINT script.
D. Add a startup script to the GKE instance group to mount the NFS share at node startup. Copy the configuration files into the container, and start the service using an ENTRYPOINT script.

Q75. You migrated your applications to Google Cloud Platform and kept your existing monitoring platform. You now find that your notification system is too slow for time-critical problems. What should you do?
A. Replace your entire monitoring platform with Stackdriver.
B. Install the Stackdriver agents on your Compute Engine instances.
C. Use Stackdriver to capture and alert on logs, then ship them to your existing platform.
D. Migrate some traffic back to your old platform and perform A/B testing on the two platforms concurrently.
Reference: https://cloud.google.com/monitoring/

Q76. Your company's development teams want to use Cloud Build in their projects to build and push Docker images to Container Registry. The operations team requires all Docker images to be published to a centralized, securely managed Docker registry that the operations team manages. What should you do?
A. Use Container Registry to create a registry in each development team's project. Configure the Cloud Build build to push the Docker image to the project's registry. Grant the operations team access to each development team's registry.
B. Create a separate project for the operations team that has Container Registry configured. Assign appropriate permissions to the Cloud Build service account in each development team's project to allow access to the operations team's registry.
C. Create a separate project for the operations team that has Container Registry configured. Create a service account for each development team and assign the appropriate permissions to allow it access to the operations team's registry. Store the service account key file in the source code repository and use it to authenticate against the operations team's registry.
D. Create a separate project for the operations team that has the open source Docker Registry deployed on a Compute Engine virtual machine instance. Create a username and password for each development team. Store the username and password in the source code repository and use them to authenticate against the operations team's Docker registry.
Explanation/Reference: https://cloud.google.com/container-registry/

Q77. You are developing an internal application that will allow employees to organize community events within your company. You deployed your application on a single Compute Engine instance. Your company uses Google Workspace (formerly G Suite), and you need to ensure that the company employees can authenticate to the application from anywhere. What should you do?
A. Add a public IP address to your instance, and restrict access to the instance using firewall rules. Allow your company's proxy as the only source IP address.
B. Add an HTTP(S) load balancer in front of the instance, and set up Identity-Aware Proxy (IAP). Configure the IAP settings to allow your company domain to access the website.
C. Set up a VPN tunnel between your company network and your instance's VPC location on Google Cloud. Configure the required firewall rules and routing information for both the on-premises and Google Cloud networks.
D. Add a public IP address to your instance, and allow traffic from the internet. Generate a random hash, and create a subdomain that includes this hash and points to your instance. Distribute this DNS address to your company's employees.
Reference: https://cloud.google.com/blog/topics/developers-practitioners/control-access-your-web-sites-identity-aware-proxy

Q78. Your application is running in multiple Google Kubernetes Engine clusters. It is managed by a Deployment in each cluster. The Deployment has created multiple replicas of your Pod in each cluster. You want to view the logs sent to stdout for all of the replicas in your Deployment in all clusters. Which command should you use?
A. kubectl logs [PARAM]
B. gcloud logging read [PARAM]
C. kubectl exec -it [PARAM] journalctl
D. gcloud compute ssh [PARAM] --command="sudo journalctl"

Q79. You have written a Cloud Function that accesses other Google Cloud resources. You want to secure the environment using the principle of least privilege. What should you do?
A. Create a new service account that has Editor authority to access the resources. The deployer is given permission to get the access token.
B. Create a new service account that has a custom IAM role to access the resources. The deployer is given permission to get the access token.
C. Create a new service account that has Editor authority to access the resources. The deployer is given permission to act as the new service account.
D. Create a new service account that has a custom IAM role to access the resources. The deployer is given permission to act as the new service account.

How to study the Google Professional Cloud Developer Exam

Preparation for certification exams can draw on two types of resources.
The first type comprises study guides, reference books, and study forums, which are well suited to building knowledge from the ground up. Video tutorials and lectures are also a good option to ease the pain of thorough study and make the process more interesting, although they demand time and concentration from the learner. Smart candidates who wish to build a solid foundation across all examination topics and connected technologies typically mix video lectures with study guides to reap the advantages of each, but practice exams and practice exam engines are one important study tool that goes unnoticed by most candidates. Practice exams are designed by our experts so that exam prospects can test their knowledge of the skills attained in the course while becoming comfortable and familiar with the real exam environment. Statistics have indicated that exam anxiety plays a much bigger role in students' exam failures than fear of the unknown. The ValidExam expert team recommends preparing notes on these topics; along with that, don't forget to practice the Google Professional Cloud Developer exam dumps written by our expert team. Both can help you clear this exam with excellent marks.

Valid Professional-Cloud-Developer Exam Dumps Ensure you a HIGH SCORE: https://www.validexam.com/Professional-Cloud-Developer-latest-dumps.html

Post date: 2022-07-25 09:46:32
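Q69 above turns on the difference between UNION, which removes duplicate rows from the combined result set, and UNION ALL, which keeps every row. A minimal sketch using Python's built-in sqlite3 module illustrates the distinction; the table names and data here are hypothetical, chosen only for the example:

```python
import sqlite3

# In-memory database with two tables that have identical columns.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events_2021 (city TEXT, attendees INTEGER)")
conn.execute("CREATE TABLE events_2022 (city TEXT, attendees INTEGER)")
conn.executemany("INSERT INTO events_2021 VALUES (?, ?)",
                 [("Dallas", 120), ("Austin", 80)])
conn.executemany("INSERT INTO events_2022 VALUES (?, ?)",
                 [("Dallas", 120), ("Houston", 95)])  # ("Dallas", 120) appears in both tables

# UNION removes duplicate rows from the combined result set.
union_rows = conn.execute(
    "SELECT * FROM events_2021 UNION SELECT * FROM events_2022").fetchall()

# UNION ALL keeps every row, duplicates included.
union_all_rows = conn.execute(
    "SELECT * FROM events_2021 UNION ALL SELECT * FROM events_2022").fetchall()

print(len(union_rows))      # 3 distinct rows
print(len(union_all_rows))  # 4 rows, duplicate retained
```

Because the question asks for duplicates to be removed, UNION is the operator that matches the requirement; UNION ALL is faster but keeps the repeated row.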
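Option D in Q72 reflects the ramp-up guidance in the Cloud Storage request-rate documentation, which also recommends retrying 429 and 5xx responses with exponential backoff. A minimal sketch of that retry pattern follows; the `do_upload` callable and the delay values are hypothetical stand-ins for a real Cloud Storage JSON API request:

```python
import random
import time

def upload_with_backoff(do_upload, max_retries=5, base_delay=1.0, sleep=time.sleep):
    """Retry an upload on 429/5xx responses using exponential backoff with jitter.

    `do_upload` is a hypothetical callable returning an HTTP status code;
    in a real application it would wrap a Cloud Storage JSON API request.
    """
    status = None
    for attempt in range(max_retries):
        status = do_upload()
        if status < 400:
            return status                       # Success: no retry needed.
        if status == 429 or 500 <= status < 600:
            # Backoff doubles each attempt (1s, 2s, 4s, ...) plus up to 1s of jitter.
            sleep(base_delay * (2 ** attempt) + random.random())
        else:
            break                               # Non-retryable client error.
    return status

# Simulated flaky endpoint: two throttled/unavailable responses, then success.
responses = iter([429, 503, 200])
result = upload_with_backoff(lambda: next(responses), sleep=lambda s: None)
print(result)  # 200
```

The injectable `sleep` parameter keeps the sketch testable without real delays; production code would leave it as `time.sleep` and cap the total retry budget.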