CONEXIONHR

ID 3911 – Data Engineer

Job Category: Data
Job Location: LATAM

Join as a Data Engineer and help build and maintain data solutions in AWS for one of our largest clients. You'll design, develop, test, and deploy new features, working with large datasets, Python, Spark, and various AWS services, while mentoring others and leading projects that impact the team's success.
We are looking for experienced Data Engineers to help a large customer migrate legacy ETL jobs (currently Glue/Lambda/Snowflake for business systems) to their Airflow/Databricks standard.

Requirements:
● Level Of Experience – 10+ years.
● 8+ years of data engineering experience.
● 4+ years of experience with Python.
● 4+ years of experience in Airflow.
● 4+ years of experience in Databricks and PySpark.
● 2+ years of experience in AWS Glue.
● 2+ years of experience in Snowflake.
● 4+ years of experience with SQL; some NoSQL.
● Experience processing large datasets.
● CI/CD experience (Git, Jenkins).
● Experience with databases (Oracle, Redshift, Aurora).
● AWS Compute, Database, and Management tools experience.
● Skills Required – ETL, Data flows, ingestion, reporting, analytics, general communication and technical troubleshooting.
● Primary Technology Stack – AWS, Glue, Snowflake, Airflow, Databricks, ETL, Lambda.
● Primary Responsibilities – Help migrate a number of legacy ETL jobs (Lambda, Glue, Snowflake) to the customer's new ETL standard (Airflow + Databricks).

Nice to have:
● Java/Spring Boot web app development experience.
● Microservices/RESTful API experience.
● AWS Certification/Cloud migration experience.

Benefits:
● 100% remote.
● USD payment.
● 4 weeks of vacation and 10 paid local holidays.


Apply for this position

If you are already talking to a CONEXIONHR recruiter, do not fill out the form; just keep the conversation going with them.

Allowed file type(s): .pdf, .doc, .docx