INSPYR Solutions

Data Operations Engineer

INSPYR Solutions, Washington, District of Columbia, US, 20022


Position Details

Title: Data Operations Engineer
Location: Washington, D.C. (Hybrid)
Duration: 6+ month contract (with extension)
Pay Range: $55/hr - $62.50/hr

Position Summary

The Data Operations Engineer oversees the development, management, and optimization of a cloud data platform on Google Cloud Platform (GCP). This role ensures secure, efficient, and scalable data solutions and is instrumental in establishing and maintaining the data infrastructure that supports advanced analytics and research on workforce development, higher education, labor market, and economic data for state partners.

Key Responsibilities

- Manage data ingestion pipelines, optimize system performance, implement data governance policies, and provide technical leadership to support data-driven decision making.
- Operate a data environment leveraging GCP services such as BigQuery, Cloud Run, Cloud Storage, and Vertex AI.
- Collaborate with stakeholders experienced in comparable cloud data ecosystems (e.g., Snowflake, Amazon Redshift, Databricks) and pivot to GCP tooling as needed.
- Support the Education Analytics and Technical Services team within the Employer Alignment focus area, ensuring data infrastructure meets strategic needs.

Detailed Responsibilities

Data Management (40%)

- Cloud Data Platform Administration: Maintain and optimize the Google Cloud-based data warehouse and storage (BigQuery, Cloud Storage) for secure, efficient, and scalable data solutions.
- ETL/ELT Pipeline Development: Develop and orchestrate scalable ETL/ELT pipelines for data ingestion and transformation using GCP services (e.g., Cloud Run, Dataflow).
- Data Quality & Governance: Ensure data integrity, quality, and governance, including data security, access controls, and regulatory compliance.
- Third-Party Data Integration: Integrate third-party data sources (e.g., via APIs and data marketplaces) for internal research teams and state agency partners; onboard new data sources to expand the data model.
- Performance Optimization: Optimize data structures, partitioning, and query strategies in BigQuery; monitor and tune resource usage for cost-effective operations.

System Administration and Performance Optimization (30%)

- Monitoring & Troubleshooting: Monitor data services for high availability and responsiveness; resolve issues across GCP services.
- Resource & Cost Management: Manage storage resources, query performance, and workload scheduling; implement cost-effective strategies in the GCP environment.
- Automation & DevOps: Automate data workflows and deployments (Infrastructure-as-Code and CI/CD where possible); maintain documentation for data models and integrations.

Data Collaboration and Reporting Support (20%)

- Analytics Enablement: Provide data access and support to analysts and researchers; ensure data is accessible and ready for insights.
- Data Quality Collaboration: Drive data quality improvements with stakeholders; establish feedback loops for data definitions and accuracy.
- Dashboard and App Support: Assist in developing analytics dashboards and apps; support BI tools such as Tableau and simple analytics apps by provisioning data and optimizing queries.
- Alignment with Analytics Needs: Ensure data infrastructure supports analytic capabilities; adjust data models or pipelines to enable new metrics and insights.

Team Leadership & DEI Commitment (10%)

- Technical Leadership: Guide the team on data warehousing, engineering, and governance; promote data literacy and efficient use of the data platform.
- Diversity, Equity & Inclusion: Promote equitable workplace practices and diverse perspectives in data operations.
- Collaborative Culture: Foster a collaborative, inclusive environment with cross-functional teamwork and transparent, respectful interactions.

Qualifications and Experience

- Education: Bachelor's degree in computer science, data engineering, information systems, or a related field (or equivalent work experience). A master's degree is a plus.
- Cloud Data Platform Expertise: 5+ years of experience managing cloud-based data platforms in enterprise environments; Google Cloud Platform (BigQuery, Cloud Storage) preferred. Experience with Snowflake, Redshift, or Databricks is valued, with a willingness to pivot to GCP.
- Data Modeling & Architecture: Proven ability to design data models that support research, dashboarding, and analytics, and to optimize data workflows for large datasets.
- Programming & Scripting: Proficiency in SQL and in Python (or a similar language) for data engineering and automation.
- ETL/ELT & Integration: Hands-on experience with ETL/ELT pipelines and cloud data integration; experience with third-party data from APIs and data marketplaces.
- CI/CD: Familiarity with Terraform, Cloud Build, GitHub Actions, or Jenkins for deployment automation.
- Data Governance & Security: Experience managing security and access controls; knowledge of FERPA, CCPA, SOC 2, and ISO 27001 as they relate to education and labor contexts.
- Version Control: Proficient with Git.
- Analytical Tools: Experience with Tableau or Power BI; ability to support analytics apps and dashboards.
- Domain Experience: Experience with state-level workforce, education, or economic datasets is a plus.
- Soft Skills: Strong problem-solving, communication, and collaboration skills; experience with agile project management for data infrastructure.
- Certifications: Google Cloud Professional Data Engineer and/or Cloud Architect certification preferred.

EEO Statement

INSPYR Solutions provides Equal Employment Opportunities (EEO) to all employees and applicants regardless of race, color, religion, national origin, age, disability, genetics, or other protected characteristics. INSPYR Solutions complies with applicable federal, state, and local nondiscrimination laws.

About INSPYR Solutions

Technology is our focus and quality is our commitment. We deliver flexible technology and talent solutions aligned with clients’ objectives. Learn more at inspyrsolutions.com.
