Senior AWS Data Engineer
Benefits:
Hybrid
Competitive salary
Opportunity for advancement
Job Title: Senior AWS Data Engineer
Location: Dallas, TX (Hybrid 3 days onsite)
Experience: 8-12 years
Interview Process: In-Person
Profiles: Locals & Non-Locals Can Apply
Overview
We are looking for an experienced AWS Data Engineer with strong expertise in ETL, cloud migration, and large-scale data engineering.
The ideal candidate is hands-on with AWS, Python/PySpark, and SQL, and can design, optimize, and manage complex data pipelines.
This role requires collaboration across teams to deliver secure, scalable, and high-quality data solutions that drive business intelligence and operational efficiency.
Key Responsibilities
- Design, build, and maintain scalable ETL pipelines across AWS and SQL-based technologies.
- Assemble large, complex datasets that meet business and technical requirements.
- Implement process improvements by re-architecting infrastructure, optimizing data delivery, and automating workflows.
- Ensure data quality and integrity across multiple sources and targets.
- Orchestrate workflows with Apache Airflow (MWAA) and support large-scale cloud migration projects.
- Conduct ETL testing, apply test-driven development (TDD), and participate in code reviews.
- Monitor, troubleshoot, and optimize pipelines for performance, reliability, and security.
- Collaborate with cross-functional teams and participate in Agile ceremonies (sprints, reviews, stand-ups).
Requirements
- 8-12 years of experience in Data Engineering, with a deep focus on ETL, cloud pipelines, and Python development.
- 5+ years of hands-on coding with Python (primary), PySpark, and SQL.
- Proven experience with AWS services: Glue, EMR (Spark), S3, Lambda, ECS/EKS, MWAA (Airflow), IAM.
- Experience with Aurora, DynamoDB, Redshift, and AWS Data Lakes.
- Strong knowledge of data modeling, database design, and advanced ETL processes (including Alteryx).
- Proficiency with structured and semi-structured file types (Delimited Text, Fixed Width, XML, JSON, Parquet).
- Experience with Azure Service Bus or equivalent AWS streaming/messaging tools (SNS, SQS, Kinesis, Kafka).
- CI/CD expertise with GitLab or similar, plus hands-on Infrastructure-as-Code experience (Terraform, Python, Jinja, YAML).
- Familiarity with unit testing, code quality tools, containerization, and security best practices.
- Solid Agile development background, with experience in Agile ceremonies and practices.
Flexible work from home options available.
- Location: Dallas
- Salary: $60 - $65 per hour
- Category: Technology