UNICOM Technologies Inc
Education and Experience
Must hold a B.Tech. in Computer Science or an equivalent 4‑year degree with a major in Computer Science.
Minimum of 5 years hands‑on data engineering experience using distributed computing approaches (Spark, Hadoop, Databricks).
Proven track record of successfully designing and implementing cloud‑based data solutions in Azure.
Deep understanding of data modeling concepts and techniques.
Strong proficiency with relational and non‑relational database systems.
Qualifications
Azure Cloud Data Solutions (Data Lake, Databricks, distributed systems) – 5+ years.
AI/ML Architecture (end‑to‑end pipeline: ingestion → training → deployment → monitoring).
Data Engineering (Spark, Hadoop, MapReduce).
Data Modeling & DB Systems (relational & NoSQL).
Preferred Qualifications
Advanced knowledge of cloud‑specific data services (e.g., Databricks, Azure Data Lake).
Expertise in big data technologies (e.g., Hadoop, Spark).
Strong understanding of data security and governance principles.
Experience with scripting and query languages (Python, SQL).
Additional Skills
Exemplary written and verbal communication skills to collaborate effectively with all teams and stakeholders.
Outstanding analytical and problem‑solving skills for complex data challenges.
Ability to work effectively in cross‑functional teams and demonstrate potential for technical leadership.
Job Details
Seniority level: Mid‑Senior level
Employment type: Contract
Job function: Information Technology
Industry: IT Services and IT Consulting
Location: San Jose, CA
Salary: $210,600 – $305,100