Published 12/2022
MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 KHz
Language: English | Size: 1.37 GB | Duration: 3h 28m
Pass Google Cloud Certified Professional Data Engineer Exam 2023

What you'll learn
● Designing data processing systems
● Building and operationalizing data processing systems
● Operationalizing machine learning models
● Ensuring solution quality
● Designing data pipelines
● Designing a data processing solution
● Migrating data warehousing and data processing
● Building and operationalizing storage systems
● Building and operationalizing pipelines
● Building and operationalizing processing infrastructure
● Leveraging pre-built ML models as a service
● Deploying an ML pipeline
● Measuring, monitoring, and troubleshooting machine learning models
● Designing for security and compliance
● Ensuring scalability and efficiency
● Ensuring reliability and fidelity
● Ensuring flexibility and portability

Requirements
Everything you need in order to pass the Google Cloud Certified Professional Data Engineer exam will be covered in this course.

Description
Designing data processing systems

Selecting the appropriate storage technologies. Considerations include:
● Mapping storage systems to business requirements
● Data modeling
● Trade-offs involving latency, throughput, transactions
● Distributed systems
● Schema design

Designing data pipelines. Considerations include:
● Data publishing and visualization (e.g., BigQuery)
● Batch and streaming data (e.g., Dataflow, Dataproc, Apache Beam, Apache Spark and Hadoop ecosystem, Pub/Sub, Apache Kafka)
● Online (interactive) vs. batch predictions
● Job automation and orchestration (e.g., Cloud Composer)

Designing a data processing solution. Considerations include:
● Choice of infrastructure
● System availability and fault tolerance
● Use of distributed systems
● Capacity planning
● Hybrid cloud and edge computing
● Architecture options (e.g., message brokers, message queues, middleware, service-oriented architecture, serverless functions)
● At-least-once, in-order, and exactly-once event processing

Migrating data warehousing and data processing. Considerations include:
● Awareness of current state and how to migrate a design to a future state
● Migrating from on-premises to cloud (Data Transfer Service, Transfer Appliance, Cloud Networking)
● Validating a migration

Building and operationalizing data processing systems

Building and operationalizing storage systems. Considerations include:
● Effective use of managed services (Cloud Bigtable, Cloud Spanner, Cloud SQL, BigQuery, Cloud Storage, Datastore, Memorystore)
● Storage costs and performance
● Life cycle management of data

Building and operationalizing pipelines. Considerations include:
● Data cleansing
● Batch and streaming
● Transformation
● Data acquisition and import
● Integrating with new data sources
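To make the pipeline considerations above more concrete, here is a minimal, illustrative Apache Beam sketch in Python. It is not taken from the course materials: the project, topic, table, and field names are placeholders, and it simply reads events from Pub/Sub, applies a small cleansing transform, and appends rows to BigQuery.

# Illustrative sketch only; resource names and fields are placeholders, not from the course.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def clean_record(message: bytes) -> dict:
    # Parse a Pub/Sub message and keep only the fields the target table expects.
    record = json.loads(message.decode("utf-8"))
    return {
        "user_id": record.get("user_id"),
        "event": (record.get("event") or "").strip().lower(),
        "value": float(record.get("value", 0)),
    }


def run():
    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True  # Pub/Sub input implies streaming mode

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")  # placeholder topic
            | "CleanRecords" >> beam.Map(clean_record)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # placeholder table
                schema="user_id:STRING,event:STRING,value:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()

The same pipeline code can run locally for testing or on Dataflow by changing the runner in the pipeline options, which is the kind of batch/streaming and portability trade-off the outline above calls out.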
Building and operationalizing processing infrastructure. Considerations include:
● Provisioning resources
● Monitoring pipelines
● Adjusting pipelines
● Testing and quality control

Operationalizing machine learning models

Leveraging pre-built ML models as a service. Considerations include:
● ML APIs (e.g., Vision API, Speech API)
● Customizing ML APIs (e.g., AutoML Vision, AutoML text)
● Conversational experiences (e.g., Dialogflow)

Deploying an ML pipeline. Considerations include:
● Ingesting appropriate data
● Retraining of machine learning models (AI Platform Prediction and Training, BigQuery ML, Kubeflow, Spark ML)
● Continuous evaluation

Choosing the appropriate training and serving infrastructure. Considerations include:
● Distributed vs. single machine
● Use of edge compute
● Hardware accelerators (e.g., GPU, TPU)

Measuring, monitoring, and troubleshooting machine learning models. Considerations include:
● Machine learning terminology (e.g., features, labels, models, regression, classification, recommendation, supervised and unsupervised learning, evaluation metrics)
● Impact of dependencies of machine learning models
● Common sources of error (e.g., assumptions about data)

Ensuring solution quality

Designing for security and compliance. Considerations include:
● Identity and access management (e.g., Cloud IAM)
● Data security (encryption, key management)
● Ensuring privacy (e.g., Data Loss Prevention API)
● Legal compliance (e.g., Health Insurance Portability and Accountability Act (HIPAA), Children's Online Privacy Protection Act (COPPA), FedRAMP, General Data Protection Regulation (GDPR))

Ensuring scalability and efficiency. Considerations include:
● Building and running test suites
● Pipeline monitoring (e.g., Cloud Monitoring)
● Assessing, troubleshooting, and improving data representations and data processing infrastructure
● Resizing and autoscaling resources

Ensuring reliability and fidelity. Considerations include:
● Performing data preparation and quality control (e.g., Dataprep)
● Verification and monitoring
● Planning, executing, and stress testing data recovery (fault tolerance, rerunning failed jobs, performing retrospective re-analysis)
● Choosing between ACID, idempotent, and eventually consistent requirements

Ensuring flexibility and portability. Considerations include:
● Mapping to current and future business requirements
● Designing for data and application portability (e.g., multicloud, data residency requirements)
● Data staging, cataloging, and discovery

Overview
Section 1: Choosing the Right Product
Lecture 1 Choosing the Right Product
Section 2: Google Cloud Storage
Lecture 2 Google Cloud Storage
Section 3: Cloud SQL
Lecture 3 Cloud SQL
Section 4: Cloud Dataflow
Lecture 4 Dataflow - Part 1
Lecture 5 Dataflow Lab
Section 5: Cloud Dataproc
Lecture 6 Cloud Dataproc
Section 6: Cloud Pub/Sub
Lecture 7 Cloud Pub/Sub
Section 7: Cloud BigQuery
Lecture 8 BigQuery - Part 1
Lecture 9 BigQuery Views
Section 8: Cloud Bigtable
Lecture 10 Bigtable - Part 1
Section 9: Cloud Composer
Lecture 11 Cloud Composer
Section 10: Cloud Firestore
Lecture 12 Introduction
Section 11: Data Studio
Lecture 13 Introduction
Section 12: Cloud Dataprep
Lecture 14 Introduction
Section 13: Practice Questions & Answers
Lecture 15 Part 1
Lecture 16 Part 2
Lecture 17 Part 3
Lecture 18 Part 4
Lecture 19 Part 5
Lecture 20 Part 6
Lecture 21 Part 7
Lecture 22 Part 8
Lecture 23 Part 9
Lecture 24 Part 10
Lecture 25 Part 11

Level: Beginner, Intermediate, Advanced
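For readers who want to warm up before the BigQuery lectures and the practice questions, the snippet below is a minimal, illustrative example (not part of the course materials) of running a query with the google-cloud-bigquery Python client against a Google public dataset. It assumes Application Default Credentials and a default project are already configured in your environment.

# Illustrative sketch only; assumes Application Default Credentials are set up.
from google.cloud import bigquery


def main():
    client = bigquery.Client()  # picks up the project from the environment/credentials

    # Total word counts per corpus in the public Shakespeare sample table.
    sql = """
        SELECT corpus, SUM(word_count) AS total_words
        FROM `bigquery-public-data.samples.shakespeare`
        GROUP BY corpus
        ORDER BY total_words DESC
        LIMIT 5
    """

    for row in client.query(sql).result():  # blocks until the query job finishes
        print(f"{row.corpus}: {row.total_words}")


if __name__ == "__main__":
    main()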