ArcRemote

Python Developer (Pydantic AI/GCP) - PT Freelance - Worldwide

Project-Based

Description

Role Overview

We are seeking a skilled Python Developer with a focus on Pydantic integration and Google Cloud Platform (GCP) expertise. This role involves developing and maintaining MLOps pipelines, managing cloud infrastructure, and ensuring the seamless deployment of AI models. Join us in leveraging cutting-edge technologies to optimize machine learning processes on a global scale.

Responsibilities

  • MLOps Pipeline Development: Design, implement, and maintain automated machine learning pipelines for the training, validation, testing, and deployment of AI models.
  • Infrastructure Management: Apply Infrastructure as Code (IaC) principles with tools such as Terraform and Pulumi to provision and optimize GCP infrastructure, ensuring efficient GPU allocation and cost management.
  • Containerization & Orchestration: Package ML models and their dependencies with Docker, and deploy on Kubernetes (GKE) clusters to ensure scalability and portability.
  • CI/CD Implementation: Establish Continuous Integration and Continuous Deployment workflows for model code and artifacts, facilitating rapid and secure updates.
  • Monitoring & Observability: Implement comprehensive monitoring, logging, and alerting using tools like Prometheus, Grafana, and Google Cloud's operations suite (formerly Stackdriver) to track performance and system health.
  • DataOps & Versioning: Collaborate with data engineers to ensure reliable data flow, and implement dataset and feature version control with tools like DVC.
  • Collaboration: Work alongside data scientists to translate research models into production-ready microservices and APIs.
  • Pydantic Integration: Utilize Pydantic for data validation and settings management to maintain data quality and consistency.
  • Security & Compliance: Ensure ML systems adhere to security best practices, including access controls and data regulations.
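As a rough illustration of the Pydantic responsibility above, here is a minimal sketch of validating an inference payload, assuming Pydantic v2; the schema and field names are hypothetical, not part of this posting:

```python
# Minimal sketch of Pydantic-based data validation for an ML
# inference payload (assumes Pydantic v2; field names are illustrative).
from pydantic import BaseModel, Field, ValidationError


class PredictionRequest(BaseModel):
    """Schema for an incoming prediction request."""
    name: str = Field(min_length=1)              # which deployed model to call
    features: list[float] = Field(min_length=1)  # numeric feature vector
    threshold: float = Field(default=0.5, ge=0.0, le=1.0)


# Valid payload: parsed and type-coerced (the string "0.7" becomes a float).
req = PredictionRequest(name="churn-v2", features=[0.1, 2.0], threshold="0.7")
print(req.threshold)

# Invalid payload: rejected with a structured, machine-readable error.
try:
    PredictionRequest(name="churn-v2", features=[])
except ValidationError as exc:
    print(exc.error_count(), "validation error(s)")
```

Constraints live on the model itself, so the same schema can guard an API boundary, a pipeline step, or a settings file.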

Required Skills

  • Experience: Proven experience in MLOps, DevOps, software engineering, or an AI platform role, focusing on deploying and managing ML models.
  • Programming Proficiency: Expertise in Python, with experience in libraries like FastAPI, TensorFlow, PyTorch, and scikit-learn.
  • Cloud & Kubernetes Expertise: Strong hands-on experience with Google Cloud Platform and Kubernetes (GKE).
  • MLOps Tools: Familiarity with frameworks and tools such as MLflow, Kubeflow, and Vertex AI.
  • Software Engineering Best Practices: Solid understanding of the software development lifecycle, testing, and debugging.
  • Problem-Solving: Strong analytical skills for troubleshooting complex distributed systems.
  • Education: Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.

Nice to Have

  • Experience with Generative AI or Large Language Model solutions, including Retrieval-Augmented Generation pipelines.
  • Knowledge of big data technologies like Apache Spark or Kafka.
  • GCP or Kubernetes certifications.

Skills

Terraform, Pulumi, Docker, Kubernetes, GCP, Python, FastAPI, TensorFlow, PyTorch, scikit-learn, Machine Learning (ML), AI, Data Science, MLflow, Kubeflow, CI/CD (Continuous Integration, Continuous Deployment), DevOps, Prometheus, Grafana, Apache Spark, Kafka, Microservices, Security, Compliance