Data Engineer - W2 only - LOCALS PREFERRED
W2 only.
- AWS: must have S3 and EMR
- Terraform
- Python: must have
- Spark or PySpark: strongly preferred

Will also be working with:
- Batch processing of large data volumes (TBs)
- Snowflake, SNS, SQS, Redshift, SQL (nice to have)
- Kubernetes
- Java

DevOps/DataOps skill set. Looking for someone to manage AWS infrastructure as code (IaC) with Terraform and to build data and CI/CD pipelines using Python (illustrative sketches below). Must be a quick learner; there are a lot of projects coming.
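To give a flavor of the Spark/S3 batch work described above, here is a minimal PySpark sketch: read a day's raw Parquet data from S3, aggregate it, and write the rollup back to S3. The bucket names, paths, and the `event_type` column are hypothetical placeholders, not details from this role; on EMR the `s3://` paths resolve through EMRFS.

```python
from pyspark.sql import SparkSession, functions as F


def main():
    # On EMR, the SparkSession picks up cluster configuration automatically.
    spark = SparkSession.builder.appName("daily-events-rollup").getOrCreate()

    # Read one (hypothetical) daily partition of raw events from S3.
    events = spark.read.parquet("s3://example-raw-bucket/events/dt=2024-01-01/")

    # Simple batch aggregation: count events per type.
    daily_counts = (
        events
        .groupBy("event_type")  # hypothetical column name
        .agg(F.count("*").alias("event_count"))
    )

    # Write the curated result back to S3, overwriting the partition if it exists.
    daily_counts.write.mode("overwrite").parquet(
        "s3://example-curated-bucket/event_counts/dt=2024-01-01/"
    )

    spark.stop()


if __name__ == "__main__":
    main()
```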
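And as a sketch of the Python/CI-CD side of the role, the snippet below shows the kind of post-load check that might run as a pipeline step: verify that a batch job actually landed data under an S3 prefix before downstream steps proceed. It assumes boto3 is installed, and the bucket and prefix are hypothetical placeholders.

```python
import sys

import boto3


def partition_exists(bucket: str, prefix: str) -> bool:
    # List at most one object under the prefix; KeyCount > 0 means data landed.
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix, MaxKeys=1)
    return resp.get("KeyCount", 0) > 0


if __name__ == "__main__":
    # Hypothetical bucket/prefix; in a real pipeline these would come from config.
    ok = partition_exists("example-curated-bucket", "event_counts/dt=2024-01-01/")
    print("partition found" if ok else "partition missing")
    sys.exit(0 if ok else 1)
```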
- Location: Wilmington, DE, United States
- Category: Computer and Mathematical Occupations