Cadence is hiring: Data Warehouse Engineer in San Jose

Cadence, San Jose, California, United States

At Cadence, we hire and develop leaders and innovators who want to make an impact on the world of technology. We are looking for a hardworking, highly motivated Data Warehouse Engineer with strong collaboration skills and the ability to work across multiple domains and software disciplines. This experienced, hands-on developer role involves researching, analyzing, designing, and developing solutions to meet our business requirements. You will work closely with business users to collect requirements, prepare design documents, develop ETL logic and validation scripts that ensure 100% data accuracy, design security controls, and deploy and monitor solutions in production.

Role: Data Warehouse Engineer
Location: San Jose, CA (onsite position)

Key Responsibilities

  • Data Modeling – Designing and developing data models to store and retrieve information efficiently.
  • ETL Development – Creating and optimizing ETL processes to extract, transform, and load data from various sources, ensuring data consistency and quality.
  • Data Pipelines – Designing, implementing, and maintaining data pipelines using Azure Data Factory.
  • Database Design and Implementation – Building relational and multidimensional database structures within the data warehouse.
  • SQL Query Development – Writing efficient SQL queries, stored procedures, and functions to extract, manipulate, and analyze data.
  • Database Performance Optimization – Enhancing database performance through indexing, query tuning, and other optimization techniques.
  • Data Management – Ensuring data integrity, security, and consistency across multiple environments.
  • Troubleshooting and Problem Solving – Identifying and resolving data‑warehouse‑related issues and errors.
  • Reporting and Analysis – Developing and providing reports and dashboards based on data‑warehouse data.

Skills and Qualifications

  • 7+ years of experience in the DW/BI space, including designing models and building analytics, with at least two end‑to‑end implementations.
  • Minimum 2 years of experience in Azure SQL and Azure Data Factory.
  • Strong SQL and relational database proficiency, with experience in SQL Server, Oracle, or other systems.
  • Experience with data modeling, warehousing, and ETL processes.
  • Proficiency in Infor Omni‑Channel Campaign Management – creating jobs and ETL processes in Infor Admin, setting up reports, and managing users and groups.
  • Understanding of IBM WAS (WebSphere Application Server) and related components.
  • Expert-level knowledge of Azure Data Factory features, capabilities, and best practices.
  • Mastery of Azure SQL stack architecture (SQL Server, SQL Warehouse, Azure Data Factory).
  • Knowledge of data analysis principles and reporting techniques.
  • Knowledge of Python, Pentaho, or shell scripting is preferred.
  • Strong problem‑solving and analytical skills, with the ability to resolve complex data warehouse issues.
  • Excellent communication skills, able to convey technical concepts to technical and non‑technical audiences.
  • Functional knowledge of common business processes such as quote‑to‑cash, clickstream analysis, revenue, booking, and billing.

Education: Bachelor’s degree in computer science, a related field, or equivalent experience.

We’re doing work that matters. Help us solve what others can’t.
