Latitude Technolabs Pvt. Ltd.
Python Data Science with AI and LLM
Latitude Technolabs Pvt. Ltd., Indiana, Pennsylvania, US 15705
Latitude Technolabs is a digital solutions company based in Ahmedabad, India, offering a range of services including mobile and web development, ERP solutions, UI/UX design, and digital marketing. The company has a global presence with offices in the USA, Switzerland, and Australia, and has completed over 500 web applications and 100 mobile apps across various industries.
Job Role: We are hiring an experienced Data Scientist with expertise in Python, AI/ML, LLMs, and Generative AI, who can design and deploy intelligent systems using both traditional and modern AI techniques. The ideal candidate will be able to communicate effectively with clients and stakeholders, understand business needs, and deliver scalable AI solutions.
Role + Responsibilities:
Design and implement machine learning models using supervised (e.g., regression, classification) and unsupervised (e.g., clustering, dimensionality reduction) techniques.
Build and fine‑tune LLMs (e.g., GPT, LLaMA, Mistral, Falcon, Qwen) for domain‑specific applications.
Develop and deploy Generative AI use cases such as text summarization, Q&A systems, chatbots, and content generation (an example summarization call is sketched after this list).
Work with Agentic AI frameworks such as LangChain and LlamaIndex, and build RAG pipelines integrated with vector stores (FAISS, Pinecone, Chroma); a minimal retrieval sketch also follows this list.
Preprocess and analyze large datasets using feature engineering and statistical methods.
Implement model training, validation, testing, and performance tuning.
Communicate clearly with clients to understand requirements, explain models, and present results.
Collaborate with cross‑functional teams, including engineering and product, to productionize models, and monitor and retrain them based on feedback or data drift.
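For illustration only, a summarization use case like the one listed above can be prototyped with the Hugging Face Transformers pipeline API. This is a minimal sketch, not a prescribed approach; the checkpoint named below (sshleifer/distilbart-cnn-12-6) is an assumed public summarization model, and the input text is a placeholder.

# Minimal sketch of an abstractive summarization prototype with Hugging Face
# Transformers. The checkpoint is an assumption (a public summarization model),
# not one prescribed by this posting.
from transformers import pipeline

summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

document = (
    "Large language models are increasingly used to condense long reports, "
    "support tickets, and meeting notes into short summaries that teams can "
    "review quickly before deciding on next steps."
)

result = summarizer(document, max_length=40, min_length=10, do_sample=False)
print(result[0]["summary_text"])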
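Likewise, the retrieval step of a RAG pipeline backed by a vector store such as FAISS could look roughly like the sketch below. The embedding dimension, document vectors, and query vector are synthetic placeholders; a real pipeline would produce embeddings with a sentence-transformer or LLM embedding API and pass the retrieved documents into the model prompt.

# Minimal sketch of RAG-style vector retrieval with FAISS (synthetic data).
import numpy as np
import faiss  # pip install faiss-cpu

dim = 128  # placeholder embedding dimension
rng = np.random.default_rng(0)

# Pretend these vectors are document embeddings produced by an embedding model.
doc_embeddings = rng.random((1000, dim), dtype=np.float32)

# Build a flat L2 index and add the document vectors.
index = faiss.IndexFlatL2(dim)
index.add(doc_embeddings)

# Embed the user query the same way (placeholder vector), then retrieve the
# k nearest documents to include in the LLM prompt context.
query_embedding = rng.random((1, dim), dtype=np.float32)
distances, doc_ids = index.search(query_embedding, 5)
print("Top document ids:", doc_ids[0])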
Person Specification and Qualifications:
MCA or an equivalent degree in Computer Science, Data Science, or a related field.
3+ years of hands‑on experience in data science and machine learning.
Strong understanding of supervised learning algorithms (Logistic Regression, Random Forest, XGBoost, SVM, etc.) and unsupervised methods (KMeans, DBSCAN, PCA, etc.); an illustrative sketch follows this list.
Proficiency in Python and libraries such as pandas, NumPy, scikit‑learn, matplotlib, etc.
Experience with deep learning frameworks like PyTorch or TensorFlow.
Familiarity with LLMs and libraries like Hugging Face Transformers.
Solid grounding in NLP, text embeddings, and RAG architecture.
Exposure to LangChain, LlamaIndex, or agentic AI frameworks.
Working knowledge of vector databases (FAISS, Chroma, Pinecone).
Strong communication skills and comfort in interacting with clients/stakeholders.
Ability to work independently and collaboratively in a fast‑paced environment.
Hands‑on experience with cloud deployment across major platforms like AWS (including SageMaker), Google Cloud Platform (GCP), and Microsoft Azure.
Familiarity with model deployment pipelines, containerization (Docker), and cloud‑native APIs for scalable machine learning services.
Proven client communication skills, including requirement gathering, technical demos, progress reporting, and post‑deployment support.
Experience collaborating with cross‑functional teams, including product managers, data engineers, and other stakeholders.
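As an illustrative sketch of the supervised and unsupervised methods named above (logistic regression, random forest, PCA, KMeans), the following scikit-learn example uses a synthetic dataset as a stand-in for real project data; it is not a required workflow.

# Illustrative scikit-learn workflow: supervised classifiers plus PCA + KMeans
# on synthetic data (a stand-in for real project datasets).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=500, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Supervised: fit and evaluate two of the listed classifiers.
for model in (LogisticRegression(max_iter=1000), RandomForestClassifier(random_state=42)):
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(type(model).__name__, "test accuracy:", round(acc, 3))

# Unsupervised: project to 2 components with PCA, then cluster with KMeans.
X_2d = PCA(n_components=2).fit_transform(X)
clusters = KMeans(n_clusters=2, n_init=10, random_state=42).fit_predict(X_2d)
print("Cluster sizes:", np.bincount(clusters))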
Apply by sending your resume to talentacquisition@latitudetechnolabs.org.