# Course - Google Cloud Computing Foundations: Infrastructure in Google Cloud #471
## Lessons 4 and 5

Quiz notes:

- What protocol is used by REST APIs?
- Cloud Pub/Sub is not a message processing service. You write your own applications to process the messages stored in Cloud Pub/Sub.
- Which of the following API management systems can be used on legacy systems?
## Lesson 6 - Cloud IAM (project-level roles)

Google Cloud's Identity and Access Management (IAM) service lets you create and manage permissions for Google Cloud resources. Cloud IAM unifies access control for Google Cloud services into a single system and provides a consistent set of operations.

In this hands-on lab you learn how to assign a role to a second user and remove assigned roles with Cloud IAM. More specifically, you sign in with two different sets of credentials to experience how granting and revoking permissions works from the Google Cloud Project Owner and Viewer roles.

You should see the Browser, Editor, Owner, and Viewer roles. These four are known as primitive roles in Google Cloud. Primitive roles set project-level permissions and, unless otherwise specified, they control access and management for all Google Cloud services.
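The grant/revoke flow from this lab can be sketched with the gcloud CLI. This is a sketch, not the lab's exact commands: `PROJECT_ID` and `USER_EMAIL` are placeholders you must substitute with your own values.

```shell
# Grant the Viewer primitive role to the second user
# (PROJECT_ID and USER_EMAIL are placeholders).
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="user:USER_EMAIL" \
    --role="roles/viewer"

# Later, revoke the same binding to remove the user's access.
gcloud projects remove-iam-policy-binding PROJECT_ID \
    --member="user:USER_EMAIL" \
    --role="roles/viewer"
```

Both commands print the project's updated IAM policy, which you can also inspect with `gcloud projects get-iam-policy PROJECT_ID`.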
Quiz:

- How are user identities created in Cloud IAM?
- Which of the following is not an encryption option for Google Cloud?
- If a Cloud IAM policy gives you Owner permissions at the project level, your access to a resource in the project may be restricted by a more restrictive policy on that resource. False.
## Perform Foundational Infrastructure Tasks in Google Cloud - Challenge Scenario

You are just starting your junior cloud engineer role. Your challenge: you are now asked to help a newly formed development team with some of their initial work. Some Jooli Inc. standards you should follow:
```shell
# Not sure we need to create this project and set:
gcloud config set compute/zone us-east1-b
# Probably not needed:
gsutil mb gs://a-thumbnail-bucket
```

index.js:

```javascript
/* globals exports, require */
//jshint strict: false
//jshint esversion: 6
"use strict";
const crc32 = require("fast-crc32c");
const gcs = require("@google-cloud/storage")();
const PubSub = require("@google-cloud/pubsub");
const imagemagick = require("imagemagick-stream");

exports.thumbnail = (event, context) => {
  const fileName = event.name;
  const bucketName = event.bucket;
  const size = "64x64";
  const bucket = gcs.bucket(bucketName);
  const topicName = "a-thumbnail";
  const pubsub = new PubSub();
  if (fileName.search("64x64_thumbnail") == -1) {
    // doesn't have a thumbnail yet; get the filename extension
    var filename_split = fileName.split('.');
    var filename_ext = filename_split[filename_split.length - 1];
    var filename_without_ext = fileName.substring(0, fileName.length - filename_ext.length);
    if (filename_ext.toLowerCase() == 'png' || filename_ext.toLowerCase() == 'jpg') {
      // only support png and jpg at this point
      console.log(`Processing Original: gs://${bucketName}/${fileName}`);
      const gcsObject = bucket.file(fileName);
      // filename_without_ext keeps the trailing dot, so this yields e.g. photo.64x64_thumbnail.png
      let newFilename = filename_without_ext + size + '_thumbnail.' + filename_ext;
      let gcsNewObject = bucket.file(newFilename);
      let srcStream = gcsObject.createReadStream();
      let dstStream = gcsNewObject.createWriteStream();
      let resize = imagemagick().resize(size).quality(90);
      srcStream.pipe(resize).pipe(dstStream);
      return new Promise((resolve, reject) => {
        dstStream
          .on("error", (err) => {
            console.log(`Error: ${err}`);
            reject(err);
          })
          .on("finish", () => {
            console.log(`Success: ${fileName} → ${newFilename}`);
            // set the content-type on the new object
            gcsNewObject.setMetadata(
              {
                contentType: 'image/' + filename_ext.toLowerCase()
              }, function(err, apiResponse) {});
            // notify via Pub/Sub, then settle the promise
            pubsub
              .topic(topicName)
              .publisher()
              .publish(Buffer.from(newFilename))
              .then(messageId => {
                console.log(`Message ${messageId} published.`);
                resolve();
              })
              .catch(err => {
                console.error('ERROR:', err);
                reject(err);
              });
          });
      });
    } else {
      console.log(`gs://${bucketName}/${fileName} is not an image I can handle`);
    }
  } else {
    console.log(`gs://${bucketName}/${fileName} already has a thumbnail`);
  }
};
```

Deploy the function:

```shell
gcloud functions deploy thumbnail
```
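A fuller deploy-and-trigger sequence for this task might look like the following. This is a sketch: the bucket name `a-thumbnail-bucket` comes from the notes above, while the runtime flag and the file name `image.png` are assumptions — check your lab instructions for the exact values.

```shell
# Deploy with a Cloud Storage trigger so uploads invoke the function
# (runtime and trigger flags are assumptions, not the lab's exact command).
gcloud functions deploy thumbnail \
    --runtime nodejs10 \
    --trigger-bucket a-thumbnail-bucket

# Upload any .png or .jpg to fire the trigger (image.png is a placeholder).
gsutil cp image.png gs://a-thumbnail-bucket

# Inspect the function's execution logs to confirm the thumbnail was created.
gcloud functions logs read thumbnail
```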
## Review Incomplete Tasks in the Quest

This is the final task of the quest, so there were other tasks which were incomplete.

### Cloud Storage

Cloud Storage allows world-wide storage and retrieval of any amount of data at any time. You can use Cloud Storage for a range of scenarios including serving website content, storing data for archival and disaster recovery, or distributing large data objects to users via direct download.

In this hands-on lab you will learn how to use the Cloud Console to create a storage bucket, then upload objects, create folders and subfolders, and make those objects publicly accessible.

Create a bucket (bucket naming rules apply). Each bucket has a default storage class, which you can specify when you create your bucket.

```shell
gsutil mb gs://YOUR-BUCKET-NAME/
gsutil -m cp -r gs://spls/gsp067/python-docs-samples .   # copy sample app
```

### Cloud IAM

Same Cloud IAM lab as described in the Lesson 6 notes above.

Cloud IAM: Qwik Start — https://google.qwiklabs.com/focuses/551

Navigation menu > IAM & Admin > IAM

### Cloud Monitoring

Cloud Monitoring provides visibility into the performance, uptime, and overall health of cloud-powered applications. Cloud Monitoring collects metrics, events, and metadata from Google Cloud, Amazon Web Services, hosted uptime probes, application instrumentation, and a variety of common application components including Cassandra, Nginx, Apache Web Server, Elasticsearch, and many others.
Cloud Monitoring ingests that data and generates insights via dashboards, charts, and alerts. Cloud Monitoring alerting helps you collaborate by integrating with Slack, PagerDuty, HipChat, Campfire, and more.

In this lab you create a VM and monitor it with Cloud Monitoring. You also install the monitoring and logging agents for your VM, which collect more information from your instance, including metrics and logs from third-party apps. Agents collect data and then send or stream it to Cloud Monitoring in the Cloud Console.

The Cloud Monitoring agent is a collectd-based daemon that gathers system and application metrics from virtual machine instances and sends them to Monitoring. By default, the Monitoring agent collects disk, CPU, network, and process metrics. Configuring the Monitoring agent allows third-party applications to get the full list of agent metrics. It is best practice to run the Cloud Logging agent on all your VM instances.

```shell
curl -sSO https://dl.google.com/cloudagents/add-monitoring-agent-repo.sh
curl -sSO https://dl.google.com/cloudagents/add-logging-agent-repo.sh
```

Remaining lab tasks: create an uptime check, create an alerting policy, and create a dashboard and chart.

### Pub/Sub

The Pub/Sub basics:
To sum it up, a producer publishes messages to a topic and a consumer creates a subscription to a topic to receive messages from it.

```shell
gcloud pubsub topics create myTopic
gcloud pubsub subscriptions create --topic myTopic mySubscription
gcloud pubsub topics publish myTopic --message "Hello"
gcloud pubsub subscriptions pull mySubscription --auto-ack
```

You published 4 messages to your topic, but only 1 was output.
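Pulling more than one message in a single request can be sketched as follows. The topic and subscription names come from the commands above; the message texts are placeholders.

```shell
# Publish a few messages (placeholder texts).
gcloud pubsub topics publish myTopic --message "Message one"
gcloud pubsub topics publish myTopic --message "Message two"
gcloud pubsub topics publish myTopic --message "Message three"

# Pull up to three messages in one request and auto-acknowledge them,
# so they are not redelivered on the next pull.
gcloud pubsub subscriptions pull mySubscription --auto-ack --limit=3
```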
```shell
gcloud pubsub topics publish myTopic --message "Publisher is starting to get the hang of Pub/Sub"
```

Add a flag to your command so you can output all three messages in one request. You may not have noticed, but you have actually been using a flag this entire time: the `--auto-ack` part of the pull command is a flag that automatically acknowledges each message you pull (and formats your pulled messages into the neat boxes that you see). `--limit` is another flag, which sets an upper limit on the number of messages to pull.

### Cloud Functions

Cloud Functions is a serverless execution environment for building and connecting cloud services. With Cloud Functions you write simple, single-purpose functions that are attached to events emitted from your cloud infrastructure and services. Your Cloud Function is triggered when an event being watched is fired. Your code executes in a fully managed environment; there is no need to provision any infrastructure or worry about managing any servers.

Cloud Functions can be written in Node.js, Python, and Go, and are executed in language-specific runtimes. You can take your Cloud Function and run it in any standard Node.js runtime, which makes both portability and local testing a breeze.

Cloud Functions provides a connective layer of logic that lets you write code to connect and extend cloud services: listen and respond to a file upload to Cloud Storage, a log change, or an incoming message on a Cloud Pub/Sub topic. Cloud Functions augments existing cloud services and allows you to address an increasing number of use cases with arbitrary programming logic. Cloud Functions have access to the Google Service Account credential and are thus seamlessly authenticated with the majority of Google Cloud services, such as Datastore, Cloud Spanner, Cloud Translation API, and Cloud Vision API, as well as many others. In addition, Cloud Functions are supported by numerous Node.js client libraries, which further simplify these integrations.
Cloud events are things that happen in your cloud environment. These might be things like changes to data in a database, files added to a storage system, or a new virtual machine instance being created. Events occur whether or not you choose to respond to them.

You create a response to an event with a trigger. A trigger is a declaration that you are interested in a certain event or set of events. Binding a function to a trigger allows you to capture and act on events. For more information on creating triggers and associating them with your functions, see Events and Triggers.

Cloud Functions removes the work of managing servers, configuring software, updating frameworks, and patching operating systems. The software and infrastructure are fully managed by Google so that you just add code. Furthermore, provisioning of resources happens automatically in response to events. This means that a function can scale from a few invocations a day to many millions of invocations without any work from you.
https://cloud.google.com/sdk/gcloud/reference/functions/event-types/list

```shell
mkdir gcf_hello_world
```

index.js:

```javascript
/**
 * Background Cloud Function to be triggered by Pub/Sub.
 * This function is exported by index.js, and executed when
 * the trigger topic receives a message.
 *
 * @param {object} data The event payload.
 * @param {object} context The event metadata.
 */
exports.helloWorld = (data, context) => {
  const pubSubMessage = data;
  const name = pubSubMessage.data
    ? Buffer.from(pubSubMessage.data, 'base64').toString()
    : "Hello World";
  console.log(`My Cloud Function: ${name}`);
};
```

```shell
gsutil mb -p [PROJECT_ID] gs://[BUCKET_NAME]
```

When deploying a new function, you must specify `--trigger-topic`, `--trigger-bucket`, or `--trigger-http`. When deploying an update to an existing function, the function keeps the existing trigger unless otherwise specified. For this lab, you'll set the `--trigger-topic` as hello_world.

```shell
gcloud functions deploy thumbnail --stage-bucket labb1 --trigger-topic labb1 --runtime nodejs10 --region us-east1
gcloud functions describe thumbnail
```

`--stage-bucket` reference: https://cloud.google.com/sdk/gcloud/reference/functions/deploy#--stage-bucket
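Once deployed, a Pub/Sub-triggered function can be exercised and checked from the CLI. This is a sketch: the topic name `labb1` and function name `thumbnail` come from the deploy command above, and the message text is a placeholder.

```shell
# Publish a message to the trigger topic to invoke the function.
gcloud pubsub topics publish labb1 --message "Hello World!"

# Check the function's execution logs for the console output.
gcloud functions logs read thumbnail
```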
## Links

- Course - Google Cloud Computing Foundations: Infrastructure in Google Cloud: https://google.qwiklabs.com/course_templates/154
- Course session: https://google.qwiklabs.com/course_sessions/116497
- Quest - Perform Foundational Infrastructure Tasks in Google Cloud: https://google.qwiklabs.com/quests/118
- Lessons: 4, 5, 6
## Quiz answers (Lesson 6)

Q: How are user identities created in Cloud IAM?
A: User identities are created outside of Google Cloud using a Google-administered domain. Creating users and groups within Google Cloud is not possible.

Q: Which of the following is not an encryption option for Google Cloud?
A: Scripted encryption keys is not an option with Google Cloud.

Q: If a Cloud IAM policy gives you Owner permissions at the project level, your access to a resource in the project may be restricted by a more restrictive policy on that resource.
A: False. Policies are a union of the parent and the resource. If a parent policy is less restrictive, it overrides a more restrictive resource policy.