
AWS Data Architect

Jobs via Dice, Torrance, California, United States, 90504


Location: Torrance, CA (Onsite)

Duration: Contract

Key Responsibilities

Data Architecture Design:

Architect and implement a scalable data hub solution on AWS using best practices for data ingestion, transformation, storage, and access control.

Define data models, data lineage, and data quality standards for the DataHub.

Select appropriate AWS services (e.g., S3, Glue, Redshift, Athena, Lambda) based on data volume, access patterns, and performance requirements; a provisioning sketch follows this section.

Design the architecture to accommodate the AI/ML applications planned for the next phase.
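By way of illustration, here is a minimal sketch of provisioning the storage and catalog foundation such a hub might start from, assuming boto3 with configured credentials; the region, bucket, and database names are hypothetical, not the employer's actual design:

```python
import boto3

s3 = boto3.client("s3", region_name="us-west-2")
glue = boto3.client("glue", region_name="us-west-2")

BUCKET = "example-datahub-raw"  # hypothetical bucket name

# Raw-zone bucket, with versioning enabled to support lineage and recovery.
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
)
s3.put_bucket_versioning(
    Bucket=BUCKET,
    VersioningConfiguration={"Status": "Enabled"},
)

# Glue Data Catalog database shared by downstream Athena/Redshift Spectrum
# queries and, later, AI/ML feature pipelines.
glue.create_database(
    DatabaseInput={
        "Name": "datahub_raw",
        "Description": "Raw zone of the data hub (hypothetical).",
    }
)
```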

Data Ingestion and Integration:

Design and build data pipelines to extract, transform, and load data from various sources (databases, APIs, flat files) into the DataHub using AWS Glue, AWS Batch, or custom ETL processes.

Implement data cleansing and normalization techniques to ensure data quality, as sketched after this section.

Manage data ingestion schedules and error handling mechanisms.
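A minimal PySpark sketch of the cleansing and normalization step, assuming raw CSVs land in an S3 drop zone and curated output is written as partitioned Parquet; the paths and column names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("datahub-ingest").getOrCreate()

# Hypothetical raw CSV drop zone.
raw = spark.read.option("header", True).csv("s3://example-datahub-raw/orders/")

clean = (
    raw.dropDuplicates(["order_id"])                          # de-duplicate on key
       .withColumn("email", F.lower(F.trim(F.col("email"))))  # normalize text fields
       .withColumn("order_ts", F.to_timestamp("order_ts"))    # enforce types
       .withColumn("order_date", F.to_date("order_ts"))       # derive partition column
       .filter(F.col("order_id").isNotNull())                 # basic quality gate
)

# Partitioned Parquet keeps downstream Athena/Redshift Spectrum scans cheap.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-datahub-curated/orders/"
)
```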

Data Governance and Access Control:

Establish data access controls and security policies to protect sensitive data within the DataHub using IAM roles and policies (see the sketch after this section).

Develop data governance frameworks including data quality checks, data lineage tracking, and data retention policies.
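A minimal sketch of a least-privilege IAM policy for one hub domain, assuming access is controlled at the S3 prefix level; the ARNs and policy name are hypothetical:

```python
import json
import boto3

iam = boto3.client("iam")

# Grants read access to the curated "orders" prefix only.
analyst_read_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ReadCuratedOrdersOnly",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-datahub-curated",
                "arn:aws:s3:::example-datahub-curated/orders/*",
            ],
        }
    ],
}

iam.create_policy(
    PolicyName="DataHubOrdersReadOnly",
    PolicyDocument=json.dumps(analyst_read_policy),
)
```

The policy would then be attached to an analyst role rather than to individual users, keeping access auditable per domain.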

Data Analytics Enablement:

Create data catalogs and metadata management systems to facilitate data discovery and understanding by business users and data analysts.

Design and implement data views and dashboards using Power BI to enable data exploration and visualization.

Create data warehouses and data marts to meet business needs, as sketched below.
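A minimal sketch of materializing a data mart with an Athena CTAS statement, assuming the curated tables are already registered in the Glue Data Catalog; the database, table, and S3 locations are hypothetical:

```python
import boto3

athena = boto3.client("athena", region_name="us-west-2")

# CTAS builds a Parquet-backed mart table from the curated zone.
ctas = """
CREATE TABLE datahub_marts.monthly_sales
WITH (format = 'PARQUET',
      external_location = 's3://example-datahub-marts/monthly_sales/') AS
SELECT date_trunc('month', order_ts) AS month,
       region,
       sum(amount) AS revenue
FROM datahub_curated.orders
GROUP BY 1, 2
"""

athena.start_query_execution(
    QueryString=ctas,
    QueryExecutionContext={"Database": "datahub_curated"},
    ResultConfiguration={"OutputLocation": "s3://example-datahub-athena-results/"},
)
```

The resulting table can then be surfaced to Power BI as a curated, pre-aggregated source.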

Monitoring and Optimization:

Monitor data pipeline performance, data quality, and system health to identify and resolve issues proactively (see the alarm sketch below).

Optimize data storage and processing costs by leveraging AWS cost optimization features.
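A minimal sketch of proactive monitoring, assuming the Glue job emits its standard job metrics: a CloudWatch alarm on failed tasks that notifies an SNS topic. The job name, account ID, and topic are hypothetical:

```python
import boto3

cw = boto3.client("cloudwatch", region_name="us-west-2")

# Alarm fires if any Glue task in the ingest job fails within a 5-minute window.
cw.put_metric_alarm(
    AlarmName="datahub-ingest-failed-tasks",
    Namespace="Glue",
    MetricName="glue.driver.aggregate.numFailedTasks",
    Dimensions=[
        {"Name": "JobName", "Value": "datahub-ingest"},
        {"Name": "JobRunId", "Value": "ALL"},
        {"Name": "Type", "Value": "count"},
    ],
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=1.0,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    AlarmActions=["arn:aws:sns:us-west-2:123456789012:datahub-alerts"],
)
```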

Data Exchange:

Develop the required governance, security, monitoring, and guardrails to enable efficient data exchange between internal applications and external vendors, partners, and SaaS providers, as sketched below.

Develop an intake process, SLAs, and usage rules for internal and external data set producers and consumers.
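A minimal sketch of one such guardrail, assuming data sets are published to a dedicated exchange bucket: a bucket policy granting a partner account read access to a single published prefix and denying non-TLS access. The account ID, bucket, and prefix are hypothetical:

```python
import json
import boto3

s3 = boto3.client("s3")

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Partner account may read only its own published prefix.
            "Sid": "PartnerReadPublishedPrefix",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::210987654321:root"},
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::example-datahub-exchange/published/partner-a/*",
        },
        {
            # Guardrail: refuse any access that is not over TLS.
            "Sid": "DenyInsecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::example-datahub-exchange",
                "arn:aws:s3:::example-datahub-exchange/*",
            ],
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        },
    ],
}

s3.put_bucket_policy(
    Bucket="example-datahub-exchange", Policy=json.dumps(policy)
)
```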

Required Skills and Experience

AWS expertise: deep understanding of S3, Glue, Redshift, Athena, Lake Formation, EventBridge, and related services.

Data modeling: proficiency in dimensional and snowflake data models for warehousing and data lakes.

Data engineering: experience with ETL/ELT processes, data cleansing, transformation, and quality checks; experience with Informatica IICS and ICDQ is a plus.

Programming: proficiency in Python and SQL; PySpark experience is a plus.

Data governance: knowledge of data classification, access control, and lineage tracking.

Contact

Lalit Kumar
Sr. Technical Recruiter
Envision Technology Solutions
