Data Workflow & Integration Engineer

Job ID: 25-10461
Job Title: Data Workflow & Integration Engineer
Location: San Jose, CA
Duration: 12 months
Contract Type: W2 only
Pay Rate: $83.99/Hour to $84.00/Hour
Role Mandate
The client's Digital Media Voice of Customer team is seeking a highly versatile contractor to design, implement, and maintain robust data workflows that enable structured intake, transformation, integration, and reporting of customer feedback and related datasets. While the current environment leverages Airtable, future work may involve additional platforms, so you should bring strong data operations experience, expertise with ETL pipelines, and the ability to quickly learn and deploy new tools. You will be the go-to resource for data architecture, ingestion, transformation, integration, and QA, ensuring data is accurate, accessible, and ready for use in analytics, reporting, and decision-making.
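For illustration only, and not part of the role description: a minimal Python sketch of the kind of intake-transform-load step described above. The file name, column names, and cleaning rules are hypothetical placeholders, and pandas is just one of several tools (Excel, SQL, etc.) the team might use.

```python
import pandas as pd

# Hypothetical intake: customer feedback exported as a CSV from a survey tool.
raw = pd.read_csv("feedback_export.csv")

# Transform: normalize free-text comments, standardize timestamps,
# then drop incomplete rows and duplicate submissions.
raw["comment"] = raw["comment"].str.strip().str.lower()
raw["submitted_at"] = pd.to_datetime(raw["submitted_at"], errors="coerce")
clean = (
    raw.dropna(subset=["comment", "submitted_at"])
       .drop_duplicates(subset=["customer_id", "comment"])
)

# Load: write a tidy dataset ready for reporting or migration to another platform.
clean.to_csv("feedback_clean.csv", index=False)
print(f"Kept {len(clean)} of {len(raw)} rows after cleaning")
```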
Responsibilities

Data Architecture & Schema Design
- Design scalable relational and/or non-relational database schemas for structured and unstructured datasets
- Define tables, relationships, metadata standards, and naming conventions to maximize clarity and future scalability
- Establish data governance rules and documentation

Data Ingestion, Transformation & Pipelines
- Develop and manage ETL/ELT workflows to pull data from various sources (APIs, CSVs, SaaS platforms, cloud storage) using both automated and manual methods
- Perform data cleaning, normalization, deduplication, and enrichment
- Prepare datasets for migration to new platforms, ensuring structural and semantic consistency
- Automate recurring data ingestion and transformation processes where possible

Platform & System Implementation
- Support new software/platform rollouts by preparing data for ingestion, testing migrations, and validating results
- Configure platform-specific data models, permissions, and workflows
- Assist with vendor coordination and implementation timelines

Workflow Automation & Integration
- Build and maintain automations to streamline manual processes (status updates, notifications, custom calculations)
- Integrate disparate platforms (e.g., Airtable, Microsoft 365, SFDC, BI tools) via native connectors, middleware (Zapier, Power Automate, etc.), or APIs (a minimal illustrative sketch follows the qualifications below)
- Create alerts, validation rules, and monitoring dashboards to ensure data quality

Quality Assurance & Maintenance
- Conduct regular QA checks for data integrity, completeness, and compliance
- Troubleshoot schema, automation, and integration issues
- Maintain version control for workflows and pipelines

Stakeholder Collaboration
- Partner with Voice of Customer, operations, and product teams to gather requirements and align workflows as needed
- Provide training and support to internal users on Airtable features and best practices
- Contribute to knowledge base and onboarding materials

Required Skills
- You have full lifecycle experience, from designing schemas and pipelines to QA and post-launch support
- You can handle large datasets confidently, without sacrificing accuracy or performance
- You have experience with unstructured data and can make sense of it using a structured thought process
- You're able to pivot quickly when platforms change, requirements shift, or new data sources emerge
- You understand how to bridge technical systems and business needs
- You work well with ambiguity, thrive in zero-to-one environments, and proactively propose solutions

Minimum Qualifications
- 5-10 years of experience in data operations, workflow design, or platform integration
- Proven experience building and maintaining ETL/ELT pipelines across multiple data sources
- Proficiency with data cleaning and transformation tools (Excel, SQL, Python, or similar)
- Strong skills in relational database design and metadata management
- Excellent written and verbal communication, with a focus on clear documentation

Bonus Qualifications
- Experience with Airtable or similar platforms
- Experience with business intelligence (BI) tools such as Power BI
- Familiarity with customer feedback, product research, or voice-of-customer workflows
- Background in platform implementation or data migration projects
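As a purely illustrative sketch of the platform integration and QA work listed under Workflow Automation & Integration above: the snippet below pages through a table via the public Airtable REST API and flags records failing a simple validation rule. The base ID, table name, field names, environment variable, and validation rule are all hypothetical placeholders.

```python
import os
import requests

# Hypothetical identifiers; a real base ID, table name, and fields would differ.
BASE_ID = "appXXXXXXXXXXXXXX"
TABLE = "Feedback"
URL = f"https://api.airtable.com/v0/{BASE_ID}/{TABLE}"
HEADERS = {"Authorization": f"Bearer {os.environ['AIRTABLE_TOKEN']}"}

def fetch_records():
    """Page through the table using Airtable's offset-based pagination."""
    records, params = [], {}
    while True:
        resp = requests.get(URL, headers=HEADERS, params=params, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        records.extend(payload.get("records", []))
        if "offset" not in payload:
            return records
        params["offset"] = payload["offset"]

def passes_qa(record):
    """Simple QA rule: every record needs a customer ID and a non-empty comment."""
    fields = record.get("fields", {})
    return bool(fields.get("Customer ID")) and bool(fields.get("Comment", "").strip())

records = fetch_records()
flagged = [r["id"] for r in records if not passes_qa(r)]
print(f"{len(records)} records fetched, {len(flagged)} flagged for review")
```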
Location: San Jose, CA, United States
Category: Computer and Mathematical Occupations