Job details:
- 2 open positions
- Full-time
- Remote
- 1st contract: June to November 2024
- Extensions possible
- Candidates located in Poland
Scope:
The client needs help establishing an infrastructure foundation on Google Cloud Platform (GCP) and migrating their legacy data lake to a manageable, scalable, and secure cloud data lake that provides the data analytics capabilities needed to meet their growing needs.
- Assess, design, plan, and migrate the existing on-premises or cloud data lake to Google BigQuery
- Assess the current system, including flows, data pipelines, schemas, and reports. Focus on the following areas:
- Data governance leading practices, definitions, guidelines, processes, and recommendations for products available in GCP: Data Catalog, Data Lineage, Data Quality, Data Masking, Data Classification
- Datasets and analytics infrastructure
- Database and application technology
- Extract, transform, and load (ETL) or extract, load, and transform (ELT) workloads
- Job orchestration and scheduling needs
- Tooling supply plan (including end-of-life support)
- Plans for continuous integration between applications, business units, and other teams using data solutions
- Desired future state for data transformation, data lake, or data analytics
Methodology: Agile
Skills:
- Bachelor's degree in business, computer science, or a related field
- Minimum 3 years of experience in the following technologies:
- BigQuery
- Dataflow
- SQL
- Python
- Experience in migrating data warehouses to BigQuery
- English required