Job details:
- Full-time
- Location: Latvia/Lithuania
- Remote work
- Client: a media group in Belgium
- Extended interview process
Scope:
- We work in a scrum environment, making collaboration and agility key to our success.
- Your day kicks off with the daily stand-up, where you sync up with the team (fellow data & analytics engineers, the scrum master, and the product owner). Together, you discuss the sprint scope, tackle potential blockers, and share relevant updates. Then, it’s time to dive in!
- You’ll be building or optimizing data pipelines and the underlying infrastructure, always keeping an eye on monitoring, efficiency, and cost-effectiveness.
- Some days, you’ll be working with DBT to model data from an existing pipeline (a brief sketch of what such a model looks like follows this list).
- Other times, you’ll assist our functional analyst in pre-refining requests or join meetings like weekly refinements or the sprint retrospective.
- Throughout the day, you stay connected with the team and stakeholders via Jira and Slack.
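To give a concrete feel for the DBT work described above, here is a minimal sketch of a dbt model; the model and source names (daily_article_views, stg_page_events) are hypothetical examples, not taken from the actual project:

    -- models/marts/daily_article_views.sql (hypothetical path and name)
    -- Aggregates events from an upstream staging model into daily view counts.
    {{ config(materialized='incremental', unique_key='view_date') }}

    select
        date_trunc('day', event_timestamp) as view_date,
        count(*) as article_views
    from {{ ref('stg_page_events') }}  -- hypothetical staging model fed by an existing pipeline
    {% if is_incremental() %}
    -- On incremental runs, only reprocess days at or after the latest materialized date.
    where date_trunc('day', event_timestamp) >= (select max(view_date) from {{ this }})
    {% endif %}
    group by 1

dbt compiles the ref() and config() Jinja into plain SQL and handles the materialization, which is what makes it well suited to organizing, structuring, and automating data transformations.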
Skills:
Must have: SQL, Scala, Python, Java, DBT
- Solid understanding of SQL
- Proven experience in building data pipelines and organizing, structuring, and automating data transformations using DBT (Data Build Tool)
- Strong interest in AWS is essential
- AWS certification (Associate level)
- Proficiency in Scala, Python, Go, Java, or shell scripting, or strong database knowledge with a willingness to learn Python and Scala
- Experience with DevOps, including CI/CD pipelines, system setup, and monitoring
- A “you build it, you run it” attitude: you take ownership of the services you ship, from development through operation
- Strong problem-solving skills, a DevOps mindset, and excellent communication skills
- Strong team spirit and collaboration skills (you enjoy pair programming and other cooperative techniques)
Nice to have:
- Experience with Terraform (or another IaC tool) for infrastructure automation
- Knowledge of large-scale distributed systems
- Experience with Domain-driven design
- Expertise in monitoring, logging, and security automation