AWS Data Engineer


Benefits: Hybrid, Long Term, Opportunity for advancement
Job Title: AWS Data Engineer
Location: Dallas, TX (Hybrid, 3 days per week in-office)
Interview Process: In-Person Interview
Profiles Required: 8-12 Years of Experience; Locals & Non-Locals Can Apply
Core Skills: Strong AWS Services, Python/PySpark, Advanced SQL
Submission Details: Resume, LinkedIn URL, Work Authorization, Location Details
Our esteemed client is seeking an experienced AWS Data Engineer with strong expertise in ETL testing, cloud migration, and Python programming in a production-grade AWS environment. You will design and maintain scalable data pipelines, ensure data quality through rigorous ETL testing, and play a key role in large-scale cloud migration initiatives. This role is ideal for someone passionate about building efficient, secure, and reliable data solutions, applying functional design principles, and leveraging automation to deliver business impact.
Key Responsibilities
- Develop and maintain scalable ETL pipelines within the AWS ecosystem.
- Conduct ETL testing to ensure data integrity, accuracy, and performance.
- Lead and support large-scale cloud migration projects.
- Use Python (primary language) along with SQL and PySpark to develop data processing and automation solutions.
- Orchestrate data workflows using Apache Airflow (including MWAA).
- Collaborate with cross-functional teams to design data models and implement industry-standard data security and classification methodologies.
- Monitor, troubleshoot, and optimize pipelines for reliability and performance.
- Document designs, conduct code reviews, and apply test-driven development (TDD) practices.
- Work in Agile environments, participating in sprint planning and daily stand-ups.
- Ensure a smooth transition of data during cloud migrations, applying DevOps and CI/CD practices where needed.
Qualifications
- 8-12 years of experience in Data Engineering, focusing on ETL, Cloud Migration, and Python development.
- 5+ years of hands-on experience with Python (primary), SQL, and PySpark, applying functional design principles and software design patterns.
- 5+ years building and deploying ETL pipelines across on-prem, hybrid, and cloud environments, with orchestration experience using Airflow.
- 5+ years of production-level AWS experience (MWAA, Glue/EMR (Spark), S3, ECS/EKS, IAM, Lambda).
- Solid understanding of core statistical principles, data modeling, and data security best practices.
- 5+ years working in Agile development with unit testing, TDD, and design documentation.
- 2+ years of DevOps/CI/CD experience, including Terraform or other Infrastructure-as-Code (IaC) platforms.
- Excellent problem-solving skills, with the ability to troubleshoot complex data and performance issues.
- Strong communication skills and the ability to work effectively in a hybrid model (3 days onsite in Dallas).
Flexible work from home options available.
Location:
Dallas
Salary:
$60 - $65 per hour
Category:
Technology