Veracity
Job Title: Data Engineer
Duration: 12 Months (Contract)
Locations: Charlotte, NC / Phoenix, AZ / Minneapolis, MN

Job Description:
We are seeking an experienced Data Engineer to join our team and contribute to the design, development, and optimization of data solutions supporting enterprise-level initiatives. The successful candidate will work on large-scale data processing, transformation, and analytics using modern cloud and ETL technologies.

Responsibilities:
- Design, build, and maintain scalable data pipelines on GCP (Google Cloud Platform).
- Develop and optimize queries in BigQuery to support analytics and reporting.
- Implement data transformation and ETL workflows using Ab Initio.
- Work with Teradata to integrate and manage large datasets.
- Ensure data quality, integrity, and security across all solutions.
- Collaborate with data architects, business analysts, and application teams to meet business requirements.
- Contribute to performance tuning, error handling, and automation of data processes.

Required Skills:
- Strong hands-on experience with GCP and BigQuery.
- Proficiency in Ab Initio ETL development.
- Experience working with Teradata databases.

Desired Skills:
- Experience with Java or Python for data processing and automation.
- Prior work in the Finance or Banking domain is a plus.