INSPYR Solutions

Senior Data Engineer

INSPYR Solutions, Glendale, California, US 91222


Title: Senior Data Engineer

Industry: Entertainment

Location: Glendale, CA

Duration: + months

Rate Range: $-

Work Requirements: US Citizen, GC Holders or Authorized to Work in the US

Description:

As a Senior Data Engineer, you will play a pivotal role in transforming data into actionable insights.

Collaborate with our dynamic team of technologists to develop cutting‑edge data solutions that drive innovation and fuel business growth. Your responsibilities will include managing complex data structures and delivering scalable and efficient data solutions.

Your expertise in data engineering will be crucial in optimizing our data‑driven decision‑making processes.

Key Responsibilities:

Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines

Build tools and services to support data discovery, lineage, governance, and privacy

Collaborate with other software / data engineers and cross‑functional teams

Tech stack includes Airflow, Spark, Databricks, Delta Lake, Snowflake, Kubernetes, and AWS

Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform

Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more

Ensure high operational efficiency and quality of the Core Data platform datasets so that our solutions meet SLAs and project reliability and accuracy to all our stakeholders (Engineering, Data Science, Operations, and Analytics teams)

Be an active participant in and advocate for agile / scrum ceremonies to collaborate and improve processes for our team

Engage with and understand our customers, forming relationships that allow us to understand and prioritize both innovative new offerings and incremental platform improvements

Maintain detailed documentation of your work and changes to support data quality and data governance requirements

Basic Qualifications:

Years of data engineering experience developing large data pipelines

Proficiency in at least one major programming language (Python, Java, Scala)

Strong SQL skills and ability to create queries to analyze complex datasets

Hands‑on production environment experience with distributed processing systems such as Spark

Hands‑on production experience creating and maintaining data pipelines with orchestration systems such as Airflow

Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Databricks, BigQuery)

Experience in developing APIs with GraphQL

Deep understanding of AWS or other cloud providers as well as infrastructure as code

Familiarity with Data Modeling techniques and Data Warehousing standard methodologies and practices

Strong algorithmic problem‑solving expertise

Advanced understanding of OLTP vs OLAP environments

Strong background in at least one of the following: distributed data processing, software engineering of data services, or data modeling

Familiarity with Scrum and Agile methodologies

Required Education:

BA/BS degree in Computer Science, Information Systems, or a related field

Our benefits package includes:

Comprehensive medical benefits

Competitive pay

401(k) retirement plan

And much more!
