AWS Data Engineer @ Ework Group
Position Description
Profile:
- Proven experience in Big Data or Cloud projects involving processing and visualization of large and unstructured datasets, across various phases of the SDLC.
- Hands-on experience with the AWS cloud, including Storage, Compute (and Serverless), Networking, and DevOps services.
- Solid understanding of AWS services, ideally supported by relevant certifications.
- Familiarity with several of the following technologies: Glue, Redshift, Lambda, Athena, S3, Snowflake, Docker, Terraform, CloudFormation, Kafka, Airflow, Spark.
- Basic proficiency in at least one of the following programming languages: Python, Scala, Java, or Bash.
- Good command of English; German language skills would be an advantage.
Nice to Have:
- Experience with orchestration tools (e.g., Airflow, Prefect).
- Exposure to CI/CD pipelines and DevOps practices.
- Knowledge of streaming technologies (e.g., Kafka, Spark Streaming).
- Experience working with Snowflake or Databricks in a production or development environment.
- Relevant certifications in AWS, data engineering, or big data technologies.
🔹 For our Client, we are looking for an AWS Data Engineer 🔹
LOCATION: 100% remote from Poland (onboarding possible in Warsaw)
Tasks:
- Design and implement solutions for processing large-scale and unstructured datasets (Data Mesh, Data Lake, or Streaming Architectures).
- Develop, optimize, and test modern Data Warehouse (DWH)/Big Data solutions based on the AWS cloud platform within CI/CD environments.
- Improve data processing efficiency and support migrations from on-premises systems to public cloud platforms.
- Collaborate with data architects and analysts to deliver high-quality cloud-based data solutions.
- Ensure data quality, consistency, and performance across AWS services and environments.
- Participate in code reviews and contribute to technical improvements.
Requirements: AWS, Kafka
Are you interested in this position?