1. You need to develop procedures to verify resilience of disaster recovery for remote recovery using GCP. Your production environment is hosted on-premises. You need to establish a secure, redundant connection between your on-premises network and the GCP network. What should you do?
Verify that Dedicated Interconnect can replicate files to GCP. Verify that Cloud VPN can establish a secure connection between your networks if Dedicated Interconnect fails.
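The Cloud VPN fallback path above can be sketched with the gcloud CLI. This is a minimal Classic VPN sketch; the gateway, network, region, peer address, destination range, and shared secret are all placeholders, and a real setup also needs firewall rules and a reserved static IP.

```shell
# Sketch: stand up a Classic Cloud VPN tunnel as a fallback path.
# All names, the region, the peer address, and the secret are placeholders.
gcloud compute target-vpn-gateways create onprem-fallback-gw \
    --network=prod-network --region=us-central1

# A Classic VPN gateway also needs ESP/UDP500/UDP4500 forwarding rules
# pointing at a reserved static IP before the tunnel will come up.
gcloud compute vpn-tunnels create onprem-fallback-tunnel \
    --target-vpn-gateway=onprem-fallback-gw --region=us-central1 \
    --peer-address=203.0.113.10 --ike-version=2 \
    --shared-secret="example-shared-secret" \
    --local-traffic-selector=0.0.0.0/0

# Route on-premises-bound traffic through the tunnel.
gcloud compute routes create onprem-fallback-route \
    --network=prod-network --destination-range=192.168.0.0/16 \
    --next-hop-vpn-tunnel=onprem-fallback-tunnel \
    --next-hop-vpn-tunnel-region=us-central1
```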
2. Your company operates nationally and plans to use GCP for multiple batch workloads, including some that are not time-critical. You also need to use GCP services that are HIPAA-certified and manage service costs. How should you design to meet Google best practices?
Provision preemptible VMs to reduce cost. Disable and then discontinue use of all GCP services and APIs that are not HIPAA-compliant.
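Provisioning a preemptible VM for a non-time-critical batch workload is a single gcloud flag; a minimal sketch, where the instance name, zone, and machine type are placeholders:

```shell
# Sketch: a preemptible worker for a non-time-critical batch job.
# Preemptible VMs cost much less but can be reclaimed at any time.
gcloud compute instances create batch-worker-1 \
    --zone=us-central1-a \
    --machine-type=n1-standard-4 \
    --preemptible
```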
3. Your customer wants to do resilience testing of their authentication layer. This consists of a regional managed instance group serving a public REST API that reads from and writes to a Cloud SQL instance. What should you do?
Schedule a disaster simulation exercise during which you can shut off all VMs in a zone to see how your application behaves.
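The zonal shutdown described above can be scripted; a sketch, assuming the zone name is a placeholder and this runs against a test project, not production:

```shell
# Sketch: simulate a zonal outage by stopping every VM in one zone.
ZONE=us-central1-a
gcloud compute instances list \
    --filter="zone:( ${ZONE} )" \
    --format="value(name)" \
  | xargs -r gcloud compute instances stop --zone="${ZONE}"
```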
4. Your BigQuery project has several users. For audit purposes, you need to see how many queries each user ran in the last month. What should you do?
Use ‘bq ls -j’ to list all jobs. Per job, use ‘bq show -j <job_id>’ to list job information and get the required information.
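The two bq commands can be sketched as follows; the project ID is a placeholder, and `<job_id>` stands in for an ID taken from the listing:

```shell
# Sketch: list all jobs in the project (raise --max_results as needed),
# then drill into one job to see its user, type, and timing.
bq ls -j --all --max_results=100000 --project_id=my-project

bq show -j --project_id=my-project <job_id>
```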
5. You want to automate the creation of a managed instance group. The VMs have many OS package dependencies. You want to minimize the startup time for new VMs in the instance group. What should you do?
Create a custom VM image with all OS package dependencies. Use Deployment Manager to create the managed instance group with the VM image.
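Whether driven by Deployment Manager or, equivalently, by the gcloud CLI, the flow is: bake the image, reference it from an instance template, and create the group from the template. A sketch, with all resource names, the source disk, and the zone as placeholders:

```shell
# Sketch: bake a custom image with dependencies pre-installed,
# then use it in an instance template and managed instance group.
gcloud compute images create app-baked-image \
    --source-disk=build-vm-disk --source-disk-zone=us-central1-a

gcloud compute instance-templates create app-template \
    --image=app-baked-image --machine-type=n1-standard-2

gcloud compute instance-groups managed create app-mig \
    --template=app-template --size=3 --zone=us-central1-a
```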
6. Your company captures all web traffic data in Google Analytics 360 and stores it in BigQuery. Each country has its own dataset. Each dataset has multiple tables. You want analysts from each country to be able to see and query only the data for their respective countries. How should you configure the access rights?
Create a group per country. Add analysts to their respective country-groups. Create a single group ‘all_analysts’, and add all country-groups as members. Grant the ‘all_analysts’ group the IAM role of BigQuery jobUser. Share the appropriate dataset with view access with each respective analyst country-group.
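The two grants above can be sketched with gcloud and bq; the project, group addresses, and dataset name are placeholders:

```shell
# Sketch: grant the umbrella group jobUser at the project level.
gcloud projects add-iam-policy-binding my-analytics-project \
    --member="group:all_analysts@example.com" \
    --role="roles/bigquery.jobUser"

# Share one country's dataset with its group via the dataset access list:
bq show --format=prettyjson my-analytics-project:germany_dataset > germany.json
# (edit germany.json: add
#  {"role": "READER", "groupByEmail": "de-analysts@example.com"}
#  to the "access" array)
bq update --source germany.json my-analytics-project:germany_dataset
```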
7. You have been engaged by your client to lead the migration of their application infrastructure to GCP. One of their current problems is that the on-premises high performance SAN is requiring frequent and expensive upgrades to keep up with the variety of workloads that are identified as follows: 20 TB of log archives retained for legal reasons; 500 GB of VM boot/data volumes and templates; 500 GB of image thumbnails; 200 GB of customer session state data that allows customers to restart sessions even if off-line for several days. Which of the following best reflects your recommendations for a cost-effective storage allocation?
Memcache backed by Persistent Disk SSD storage for customer session state data. Assorted local SSD-backed instances for VM boot/data volumes. Cloud Storage for log archives and thumbnails.
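For the log archives, cost can be reduced further with a lifecycle rule that moves objects to a colder storage class. A sketch, where the bucket name and the 30-day threshold are placeholders:

```shell
# Sketch: move log-archive objects to Coldline after 30 days.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 30}
    }
  ]
}
EOF
gsutil lifecycle set lifecycle.json gs://log-archive-bucket
```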
8. Your web application uses Google Kubernetes Engine to manage several workloads. One workload requires a consistent set of hostnames even after pod scaling and relaunches. Which feature of Kubernetes should you use to accomplish this?
StatefulSets.
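Stable, predictable hostnames (pod-0, pod-1, …) across scaling and relaunches are what a Kubernetes StatefulSet paired with a headless Service provides. A minimal sketch; the names, image, and port are placeholders:

```shell
# Sketch: StatefulSet with a headless Service, giving pods the stable
# hostnames auth-api-0, auth-api-1, auth-api-2.
kubectl apply -f - <<'EOF'
apiVersion: v1
kind: Service
metadata:
  name: auth-api
spec:
  clusterIP: None        # headless: gives each pod a stable DNS name
  selector:
    app: auth-api
  ports:
  - port: 8080
---
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: auth-api
spec:
  serviceName: auth-api
  replicas: 3
  selector:
    matchLabels:
      app: auth-api
  template:
    metadata:
      labels:
        app: auth-api
    spec:
      containers:
      - name: api
        image: gcr.io/my-project/auth-api:latest
        ports:
        - containerPort: 8080
EOF
```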
9. You are using Cloud CDN to deliver static HTTP(S) website content hosted on a Compute Engine instance group. You want to improve the cache hit ratio. What should you do?
Customize the cache keys to omit the protocol from the key.
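Omitting the protocol from the cache key means http:// and https:// requests for the same URL share one cache entry, raising the hit ratio. A sketch, with the backend service name as a placeholder:

```shell
# Sketch: drop the protocol component from the backend's CDN cache key.
gcloud compute backend-services update web-backend \
    --global \
    --no-cache-key-include-protocol
```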
10. Your architecture calls for the centralized collection of all admin activity and VM system logs within your project. How should you collect these logs from both VMs and services?
Stackdriver automatically collects admin activity logs for most services. The Stackdriver Logging agent must be installed on each instance to collect system logs.
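Installing the Logging agent on a Linux VM can be sketched as below, using Google's documented install script; run on the instance itself, and note that admin activity logs need no agent:

```shell
# Sketch: install and start the (legacy Stackdriver) Logging agent
# so the VM's system logs are shipped to Cloud Logging.
curl -sSO https://dl.google.com/cloudagents/add-logging-agent-repo.sh
sudo bash add-logging-agent-repo.sh --also-install
sudo service google-fluentd start
```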