GM Financial

Data Engineer II - ABS Data Analytics & Finance Reporting

GM Financial, Fort Worth, Texas, United States, 76102


Overview

GM Financial is the wholly owned captive finance subsidiary of General Motors, headquartered in Fort Worth, Texas. We are a global provider of auto finance solutions with operations in North America, South America and the Asia Pacific region. Through our relationships with auto dealers, we offer attractive retail financing and lease programs, as well as commercial lending products that help dealers finance and grow their businesses. At GM Financial, our team members define and shape our culture — an environment that welcomes new ideas, fosters integrity and creates a sense of community and belonging. Here we do more than work — we thrive.

Responsibilities

About the role:

The ABS Data Engineer II is a critical technical role within the GMF North America Securitization and Conduit Reporting team. This position helps the ABS reporting team build and maintain reliable, scalable data pipelines, leveraging Python, SQL, and Azure cloud technologies to extract, transform, and load data efficiently, enabling seamless data access and analysis for accounting business users. The role involves coordination with other departments and third-party software vendors.

Job Duties

Work with internal business partners to identify, capture, collect, and format data from external sources, internal systems and the data warehouse to extract features of interest

Contribute to the evaluation, research, and experimentation efforts with batch and streaming data engineering technologies in a lab to keep pace with industry innovation

Work with data engineering related groups to inform on and showcase capabilities of emerging technologies and enable adoption of these technologies and techniques

Coordinate with Privacy Compliance to ensure proper data collection and handling

Create and implement business rules and functional enhancements for data schemas and processes

Perform data load monitoring and resolution

Work with internal business clients to resolve data availability and activation issues

Qualifications

What makes you a dream candidate?

Experience with processing large data sets using Hadoop, HDFS, Spark, Kafka, Flume or similar distributed systems

Experience with ingesting various source data formats such as JSON, Parquet, SequenceFile, Cloud Databases, MQ, Relational Databases such as Oracle

Experience with cloud technologies (Azure, AWS, GCP) and native toolsets such as Azure ARM templates, HashiCorp Terraform, AWS CloudFormation

Understanding of cloud computing technologies, business drivers and emerging computing trends

Thorough understanding of Hybrid Cloud Computing: virtualization technologies, IaaS, PaaS and SaaS and the current competitive landscape

Working knowledge of object storage technologies (Azure Data Lake Storage Gen2, Amazon S3, MinIO, Ceph, etc.)

Experience with containerization (Docker, Kubernetes, Spark on Kubernetes, Spark Operator)

Working knowledge of Agile development / SAFe Scrum and Application Lifecycle Management

Strong background with source control management systems (Git or Subversion); build systems (Maven, Gradle, Webpack); code quality (Sonar); artifact repository managers (Artifactory); CI/CD (Azure DevOps)

Experience with NoSQL data stores such as Cosmos DB, MongoDB, Cassandra, Redis, Riak or other technologies that embed NoSQL with search, such as MarkLogic

Experience creating and maintaining ETL processes

Knowledge of best practices in information technology governance and privacy compliance

Experience with Adobe solutions (ideally Adobe Experience Platform, DTM/Launch) and REST APIs

Ability to troubleshoot complex problems and work across teams to meet commitments

Excellent computer skills and proficiency in digital data collection

Ability to work in an Agile/Scrum team environment

Strong interpersonal, verbal, and writing skills

Experience with digital technology solutions (DMPs, CDPs, tag management platforms, cross-device tracking, SDKs, etc.)

Knowledge of Real-Time CDP and Journey Analytics solutions

Understanding of big data platforms and architectures, data stream processing pipelines/platforms, data lakes and data lakehouses

SQL experience: querying data and sharing insights

Understanding of cloud architectures and services on platforms such as Google Cloud Platform, Microsoft Azure and AWS

Understanding of GDPR and other privacy and security topics

Experience and Education

What we require:

2-4 years of hands-on data engineering experience required

Bachelor’s Degree in related field or equivalent experience required

What We Offer

Generous benefits package available on day one, including 401(k) matching, bonding leave for new parents (12 weeks, 100% paid), tuition assistance, training, GM employee auto discount, community service pay and nine company holidays.

Our Culture

Our team members define and shape our culture — an environment that welcomes innovative ideas, fosters integrity, and creates a sense of community and belonging. Here we do more than work — we thrive.

Compensation

Competitive pay and bonus eligibility

Work-Life Balance

Flexible hybrid work environment, with a minimum of two days a week in the office in Fort Worth, Texas
