Location: LATAM (preferably Argentina CABA/AMBA region)
Job Type: Remote (hybrid expected for CABA/AMBA)
Project: Data Infrastructure Modernization
Time Zone: GMT-3 (Buenos Aires time)
English Level: B2 / C1
Get to Know Us
At Darwoft, we build digital products with heart. We're a Latin American tech company focused on creating impactful, human-centered software in partnership with companies around the globe. Our remote-first culture is built on trust, continuous learning, and collaboration.
We're passionate about tech, but even more about people. If you're looking to join a team where your ideas matter and your impact is real, welcome to Darwoft.
We're Looking For a Data Engineer
We're looking for a skilled Data Engineer to help lead our migration from on-premise data processes to a scalable cloud-native environment. You'll design and optimize ETL/ELT pipelines using modern tools, and play a key role in ensuring efficiency, scalability, and high-quality data integration as we move to the cloud.
Note: Candidates located in CABA/AMBA will be expected to attend some in-person ceremonies.
What You'll Be Doing
Design, develop, and optimize ETL/ELT processes using NiFi, AWS Glue, EMR, or Informatica Cloud.
Migrate and adapt existing ETL workflows from on-premise to cloud infrastructure.
Ingest and process unstructured data into Data Lakes, ensuring seamless integration with other data sources.
Implement and fine-tune scalable data workflows with Apache Spark or PySpark.
Develop automation and data processing scripts in Python as part of robust data pipelines.
Collaborate closely with solution architects and cross-functional teams to ensure smooth cloud transitions.
Build cloud-based pipelines (preferably on AWS) with scalability and performance in mind.
Maintain comprehensive documentation of data architecture and processes for clarity.
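As a rough illustration of the Python automation work described above (the record shape and column-naming convention are hypothetical, not part of the role description), a small pipeline step that flattens semi-structured records before loading them into a Data Lake table might look like:

```python
def flatten(record: dict, parent_key: str = "", sep: str = ".") -> dict:
    """Recursively flatten a nested dict into dotted column names,
    e.g. {"user": {"name": "Ada"}} -> {"user.name": "Ada"}."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            # Recurse into nested objects, carrying the prefix along.
            items.update(flatten(value, new_key, sep=sep))
        else:
            items[new_key] = value
    return items

raw = {"id": 1, "user": {"name": "Ada", "geo": {"country": "AR"}}}
print(flatten(raw))  # → {'id': 1, 'user.name': 'Ada', 'user.geo.country': 'AR'}
```

In practice a step like this would run inside a Spark/PySpark or Glue job over many records; the sketch only shows the per-record transform.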
What You Bring
Proven experience developing ETL/ELT jobs in both on-premise and cloud environments.
Hands-on experience with tools such as NiFi, AWS Glue, EMR, or Informatica Cloud.
Solid background in migrating data workflows to the cloud.
Familiarity with ingesting and transforming unstructured data within Data Lakes.
Proficiency in Apache Spark or PySpark for distributed data processing.
Strong Python scripting skills for automation within data pipelines.
Cloud experience, ideally with AWS (other cloud platforms are a plus).
Strong analytical and problem-solving skills with a focus on optimizing complex data workflows.
Nice to Have
AWS or other cloud platform certifications.
Experience with other data orchestration/integration tools.
Background in large-scale migration or cloud optimization projects.
Understanding of data governance and security in cloud environments.
Perks & Benefits
Full-time contract with payment in ARS
100% remote work
Competitive salaries
Legal leave and vacation days
5 extra personal days off per year
Access to top learning platforms
Benefits and discounts card
Welcome kit
Reimbursement programs
English classes
Referral program
Birthday gift
Healthy Break
Darwoft-style celebrations: anniversaries, year-end parties, birthdays, and fun team-building events
Seniority level
Not applicable
Employment type
Full-time
Job function
Information Technology
Industries
Software Development