We are looking for an AWS Data Engineer for a partner of Gravitas!
Project details:
Duration: 12 months with an option to extend
Location: North Holland; 2 days per week in the office is mandatory
Hours: 36-40 hours per week
This is what you'll be working on:
- Design, develop, and maintain data pipelines, ensuring reliable extraction, transformation, and loading (ETL) of data from diverse sources.
- Utilize Databricks and Apache Spark for large-scale data processing.
- Collaborate with fellow engineers, data scientists, and analysts to provide clean, accessible datasets.
- Work on the development and optimization of the central data platform.
- Implement real-time data streaming solutions using Kafka.
- Champion CI/CD practices for streamlined ETL pipelines and infrastructure deployment.
Who our partner is looking for:
- Proven experience as a Data Engineer in a complex data environment.
- Proficiency with Databricks and Apache Spark.
- Expertise in AWS services and cloud-based architecture.
- Strong Python and SQL scripting skills.
- Proficiency in Kafka for real-time data streaming.
- Collaborative team player with strong interpersonal skills.
- Ambition to take responsibility and deliver high-quality results.
If you would like more information about this project, please contact me using the details below.
c.maximiano@gravitasgroup.com
0657219366