Invitation Homes
Data Architect role at Invitation Homes
Invitation Homes is the nation’s premier home leasing company, pioneering a new industry supported by advanced and robust technology solutions that enhance the resident experience. We are looking for innovative, dynamic individuals who are passionate about building business-focused technology solutions using best-of-breed tech stacks and taking the platform to the next level.
The Data Architect will play a key role in designing, implementing, and maintaining robust data infrastructure to support our organization's data-driven initiatives. The candidate should have a strong background in data engineering, with expertise in data modeling, data processing, Python programming, and database SQL/PL/SQL scripting, including procedures, functions, and dynamic SQL. This role is crucial in shaping the architecture of our data platform solutions by building and optimizing data models and by standardizing and implementing efficient ETL processes. Active involvement in database management, data quality and governance, and the use of programming skills for automation and scripting are also integral to this position.
If you are motivated, passionate, a quick learner, and have outstanding data engineering skills, this role is waiting for you!
Data Modeling
Develop and implement comprehensive and scalable data models that align with business requirements and objectives.
Collaborate with data architects and analysts/information engineers to understand data needs, ensuring data models are optimized for performance and analytical use.
Regularly review and enhance existing data models to accommodate evolving business requirements and ensure long-term sustainability.
ETL Development
Design, develop, and deploy robust ETL processes to extract, transform, and load data from diverse sources into the data lake platform.
Work with business stakeholders to gather and understand data integration requirements; ensure ETL workflows meet organizational needs.
Monitor and troubleshoot ETL processes to maintain data integrity and minimize downtime.
Manage and configure orchestration of data processing workflows using an enterprise scheduler such as Apache Airflow on AWS (MWAA).
Framework, Automation & Standardization
Leverage Python and SQL to construct a framework for source data extraction, transformation, and loading tasks; craft scripts using PowerShell and Unix shell to enable end-to-end automated pipelines.
Build reusable scripts to automate code deployment (CI/CD) across environments.
Develop scripts to manage infrastructure as code (IaC) using Python/Terraform or AWS CloudFormation.
Implement data governance policies and practices to ensure data accuracy, consistency, and security.
Participate in data quality improvement initiatives and provide guidance on best practices.
Implement solutions to automate code reviews based on organizational best practices and standardization guidelines.
Education and/or Experience
Bachelor’s Degree in Computer Science or equivalent work experience
7+ years of professional development experience in enterprise-scale data warehousing, data engineering, and data lake solutions.
3+ years of experience in data modeling (dimensional modeling, normalized models, etc.).
Knowledge of ETL/ELT tools (AWS Glue, dbt, SSIS, Apache Spark, Informatica).
3+ years of hands-on experience with modern columnar data platforms (Snowflake, AWS Redshift, Azure Synapse).
Extensive experience with SQL/PL/SQL scripting, dynamic SQL, and performance tuning.
3+ years using Python for data engineering solutions.
Good knowledge of data pipeline orchestration tools like Airflow, Control-M, AutoSys.
3+ years of experience on cloud platforms (preferably AWS) with related data engineering services like Glue, S3, Lambda, CloudWatch, Parameter Store, MWAA.
Strong understanding of core infrastructure components (servers, network, storage).
Experience with Git in a team environment and agile development.
Skills / Specialized Knowledge
Excellent communication, presentation, and interpersonal skills
Ability to thrive under pressure in a fast-paced environment
Strong attention to detail and accuracy
Ability to collaborate and build consensus
Continuous learner with interest in current and emerging technologies
Experience building analytics solutions for Sales, Finance, Product, Operations, and Marketing in an enterprise setting
Experience managing, measuring, and improving data quality in a data warehouse
Experience with Salesforce and Yardi in the real estate domain is highly desirable
Experience in large teams using CI/CD and agile methodologies
Other Requirements
N/A
Work Environment
Standard office environment; may be busy and noisy at times.
Salary Range
The salary range for this position is $118,800.00 - $205,920.00, plus eligibility for an annual discretionary bonus. Actual compensation within the range depends on skills, experience, location, and applicable laws.
Compensation and Benefits
Annual bonus program
Health, dental, vision, and life insurance
Long-term and short-term disability insurance
Generous paid time off, including vacation, sick time, holidays, and floating holidays
401(k) with company matching
Casual work environment
Team events and gatherings
Invitation Homes is an equal opportunity employer committed to fostering a diverse, inclusive and innovative environment. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, Veteran status or any other protected status. If you require accommodations, please contact humanresources@invitationhomes.com.
To all recruitment agencies: Invitation Homes does not accept agency resumes. Please do not forward resumes to Invitation Homes employees. Invitation Homes is not responsible for any fees related to unsolicited resumes.