Data Engineer - Remote / Telecommute

Job Description:

Core Technical Skills:
- Advanced SQL (Snowflake, Databricks): Table management, deprecation, data querying.
- Python: Scripting for automation, ETL workflows, alert tooling.
- Airflow: DAG creation, dependency management, alert tuning (see the sketch after this list).
- Version Control & CI/CD: Git, deployment pipelines, code reviews.

Monitoring & Observability:
- Monte Carlo (MC): Alert configuration, suppression, false positive reduction.
- Observability Tooling: Integration with Airflow, Datadog, or similar tools.
- Root Cause Analysis: Debugging alert triggers and noisy pipelines.

Platform Migration and Pipeline Stability:
- Legacy to Modern Platform Migration (e.g., RDE → Alchemist or Data Infra).
- Upstream Dependency Debugging: Identify and resolve R+ or external data source failures.
- Pipeline Ownership Handoff: Documentation, transfer of Gold and People Analytics pipelines.

Process and Documentation:
- Technical Documentation: Wikis, runbooks, alert resolution docs.
- Cross-team Collaboration: Working with FDE, Data Infra, Storefront, CX, GCO.
- Data Governance Awareness: Ownership models, process compliance, alert accountability.

Data Infrastructure Hygiene:
- Table Deprecation & Cleanup: Snowflake, Salesforce, unused pipelines.
- Alert Consolidation: Eliminate redundant monitors and streamline alerting logic.
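To illustrate the kind of Airflow work listed above, the sketch below shows a minimal DAG with two dependent tasks and an on-failure callback that could route to an alerting tool. It is only a sketch: the DAG id, task names, schedule, and the notify_on_failure hook are illustrative placeholders, not part of this role's actual pipelines.

```python
# Minimal Airflow DAG sketch: two dependent tasks plus an on-failure
# callback that could forward alerts to a monitoring tool.
# All names and the schedule here are hypothetical examples.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    # Placeholder hook: in practice this would push the failed task id
    # to an alerting system (e.g., Datadog or Monte Carlo) rather than print.
    print(f"Task failed: {context['task_instance'].task_id}")


def extract():
    print("extract rows from the upstream source")


def load():
    print("load transformed rows into the warehouse")


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"on_failure_callback": notify_on_failure},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Dependency management: load runs only after extract succeeds.
    extract_task >> load_task
```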
Location: Dallas