Ascent Funding, LLC

Staff Data Engineer (#SC-0111)

Ascent Funding, LLC, Work From Home


Responsibilities

  • Design, expand, and optimize data pipeline architecture to support the critical data needs of cross-functional teams, systems, and products.
  • Build, maintain, and enhance the infrastructure for optimal data Extraction, Transformation, and Loading (ETL) and support the transition to ELT pipelines using tools like dbt.
  • Implement and optimize data workflows and pipelines leveraging Amazon Web Services (AWS) technologies, including EC2, Glue, Redshift, Relational Database Service (RDS), S3, Database Migration Service (DMS), Athena, CloudWatch, and QuickSight.
  • Develop and operationalize machine learning models using AWS SageMaker and integrate these models into data pipelines to deliver actionable business insights.
  • Manage and enhance SQL Server Integration Services (SSIS) workflows for data integration across various sources, ensuring scalability and efficiency.
  • Work with cross-platform database technologies, including MS SQL, Postgres, MySQL, and NoSQL databases like DynamoDB, to optimize data architecture and access.
  • Write advanced Python scripts for data manipulation, pipeline development, and process automation.
  • Utilize cross-platform file transfer and storage systems such as FSx, SharePoint, and Secure File Transfer Protocol (SFTP) to enable efficient and secure data movement.
  • Administer and optimize databases through performance tuning, index creation, and query optimization to enhance system efficiency.
  • Use version control systems such as Git and implement common source control development patterns to manage codebase changes effectively.
  • Employ Terraform or other infrastructure-as-code tools to automate infrastructure deployment and scaling.
  • Employer offers optional hybrid work from home for candidates within commuting distance of the office.

Job offered by Ascent Holding, Co.

Education & Other Minimum Requirements

  • Bachelor’s degree or equivalent in Computer Science, Electrical Engineering, Data Engineering, or a related field.
  • Seven years of post-Bachelor’s progressive experience in data engineering. Employer will accept a Master’s degree or equivalent in Computer Science, Electrical Engineering, Data Engineering, or a related field and five years of experience in data engineering in lieu of a Bachelor’s degree and seven years of progressive experience.
  • Work experience to include:
    1) Seven years (or five years if qualifying with a Master’s degree) of data engineering experience in a senior-level role, to include delivering large-scale data solutions;
    2) Working with AWS technologies, including EC2, Glue, Redshift, RDS, S3, DMS, Athena, CloudWatch, QuickSight, and SageMaker;
    3) Designing, building, and optimizing scalable ETL and ELT pipelines and data integration workflows using SQL, Python, SSIS, AWS Glue, and dbt;
    4) Working with cross-platform database technologies, including MS SQL, Postgres, MySQL, and NoSQL databases like DynamoDB;
    5) Working with encryption algorithms and implementing data encryption at rest and in transit using tools including PGP;
    6) Working with database administration and optimization techniques, including tuning, index creation, and query optimization;
    7) Utilizing Git and source control development patterns for version management;
    8) Working with infrastructure-as-code tools including Terraform to automate and manage cloud resources;
    9) Working with data governance, security, and compliance principles to ensure data integrity and minimize risk.

Any and all experience may be gained concurrently.

Salary: $180,502.00 per year.

How to Apply

Interested candidates should either apply to this career page posting or email a resume to the Ascent Holding Co. email address, referencing Job #SC-0111.

Equal Employment Opportunity

It is the policy of the Company to provide equal employment opportunities to all employees and employment applicants without regard to unlawful considerations of race, religion, color, national origin, sex, sexual orientation, gender identity or expression, age, sensory, physical, or mental disability, marital status, veteran or military status, genetic information, or any other classification protected by applicable local, state, or federal laws. This policy applies to all aspects of employment, including, but not limited to, hiring, job assignment, compensation, promotion, benefits, training, discipline, and termination. Reasonable accommodation is available for qualified individuals with disabilities, upon request.


Do you now, or will you in the future, require sponsorship for employment authorization (e.g., H-1B, TN, J-1, etc.) to work for our Company in the United States?

* Yes No

1) Do you have a Bachelor’s degree or equivalent in Computer Science, Electrical Engineering, Data Engineering, or a related field?

* Yes No

If yes, please list the date you received your degree and the degree field.

Do you have seven (7) years of post-Bachelor’s progressive experience in data engineering?

* Yes No

Do you have a Master’s degree or equivalent in Computer Science, Electrical Engineering, Data Engineering, or a related field?

* Yes No

If yes, please list the date you received your degree and the degree field.

Do you have five (5) years of experience in data engineering?

* Yes No

Does your work experience include seven years (or five years if qualifying with a Master’s degree) of data engineering experience in a senior-level role?

* Yes No

Does your work experience include delivering large-scale data solutions (in a senior-level data engineering role)?

* Yes No

2) Does your work experience include working with the following AWS technologies?

  • EC2
  • Glue
  • Redshift
  • RDS
  • S3
  • DMS
  • Athena
  • CloudWatch
  • QuickSight
  • SageMaker

3) Does your work experience include the following?

  • Designing scalable ETL pipelines and data integration workflows
  • Building scalable ETL pipelines and data integration workflows
  • Optimizing scalable ETL pipelines and data integration workflows
  • Designing scalable ELT pipelines and data integration workflows
  • Building scalable ELT pipelines and data integration workflows
  • Optimizing scalable ELT pipelines and data integration workflows

Designing, building, and optimizing scalable ETL and ELT pipelines and data integration workflows using:

  • SQL
  • Python
  • SSIS
  • AWS Glue
  • dbt

4) Does your work experience include working with the following cross-platform database technologies?

  • MS SQL
  • Postgres
  • MySQL
  • NoSQL databases
  • DynamoDB (NoSQL database)

5) Does your work experience include the following?

  • Encryption algorithms
  • Implementing data encryption at rest
  • Implementing data encryption in transit
  • Using PGP to implement data encryption

6) Does your work experience include the following database administration and optimization techniques?

  • Tuning
  • Index creation
  • Query optimization

7) Does your work experience include the following?

  • Utilizing Git for version management
  • Utilizing source control development patterns for version management

8) Does your work experience include the following?

  • Working with infrastructure-as-code tools including Terraform to automate cloud resources
  • Working with infrastructure-as-code tools including Terraform to manage cloud resources

9) Does your work experience include the following?

  • Working with data governance to ensure data integrity and minimize risk
  • Working with security to ensure data integrity and minimize risk
  • Working with compliance principles to ensure data integrity and minimize risk

Do you work within commuting distance of the office? (Located in downtown San Diego)

* Yes No

If the answer to the question above is NO, would you be willing to relocate at your own expense? Please note that Ascent Holding, Co. will not cover relocation costs.
