Job Description
Designing Data Architecture:
• Analyze data requirements and business needs to design high-level and detailed data architecture plans.
• Select appropriate data storage solutions such as BigQuery, Cloud Storage, and Cloud SQL based on data needs (structured, unstructured, etc.).
• Design data pipelines for data ingestion, transformation, and loading using tools like Cloud Dataflow or Cloud Dataproc (see the pipeline sketch after the qualifications list).
• Implement data security best practices and ensure data governance throughout the architecture.

Big Data and Analytics Expertise:
• Work with big data tools and frameworks like Apache Hadoop, Spark, and Flink on GCP.
• Design and implement data lakes for storing and managing large datasets.
• Integrate data pipelines with machine learning and data analytics tools on GCP.

Cloud Infrastructure Management:
• Manage and optimize GCP infrastructure for data processing and storage, considering factors like cost and scalability.
• Configure and manage cloud security for data resources on GCP.
• Stay updated on the latest GCP data services and tools to leverage them in solutions.

Collaboration and Communication:
• Collaborate with teams such as data scientists, developers, and business stakeholders to understand data needs and translate them into technical solutions.
• Document data architecture plans and designs clearly for future reference and handover.
• May present technical data solutions to stakeholders.

Skills and Qualifications (typical, may vary):
• Strong understanding of data architecture principles and best practices
• Experience with GCP data services like BigQuery, Cloud Dataflow, Cloud Dataproc, etc.
• Familiarity with big data frameworks like Hadoop, Spark, and Flink (a plus)
• Experience with cloud security principles
• Excellent communication and collaboration skills
• Experience working with SQL and data modelling
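As an illustration of the pipeline work described under "Designing Data Architecture", below is a minimal Apache Beam sketch (the SDK that Cloud Dataflow runs) that reads CSV files from Cloud Storage, applies a simple transformation, and loads the result into BigQuery. All resource names (project, bucket, dataset.table) and the three-column schema are placeholders invented for the example, not details from this posting.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_csv_line(line):
        """Map one raw CSV line to a dict matching the (hypothetical) table schema."""
        user_id, country, amount = line.split(",")
        return {"user_id": user_id, "country": country, "amount": float(amount)}


    def run():
        options = PipelineOptions(
            runner="DataflowRunner",        # swap in "DirectRunner" to test locally
            project="my-project",           # placeholder GCP project
            region="europe-west2",          # placeholder region
            temp_location="gs://my-bucket/tmp",
        )
        with beam.Pipeline(options=options) as p:
            (
                p
                # Ingest: read raw CSV lines from a Cloud Storage bucket.
                | "ReadFromGCS" >> beam.io.ReadFromText(
                    "gs://my-bucket/input/*.csv", skip_header_lines=1)
                # Transform: parse and filter the records.
                | "ParseCSV" >> beam.Map(parse_csv_line)
                | "DropNonPositive" >> beam.Filter(lambda row: row["amount"] > 0)
                # Load: append the cleaned rows into a BigQuery table.
                | "LoadToBigQuery" >> beam.io.WriteToBigQuery(
                    "my-project:analytics.transactions",
                    schema="user_id:STRING,country:STRING,amount:FLOAT",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                )
            )


    if __name__ == "__main__":
        run()

In a real engagement the schema, naming, security settings, and error handling would follow the architecture plans and governance practices this role is responsible for.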
Department: UK Europe
Open Positions: 1
Posted On: 31-Dec-
Skills Required: GCP DATA LAKE
Location: London
Years Of Exp: 10.0 to 15.0 Years
Posted On: 19-May-