DevOps / DataOps Engineer – Baku, Azerbaijan

Olmait works on innovative and challenging projects, building dedicated teams in data science, data-related software, and cloud infrastructure for startups internationally. We work in industries and fields of expertise that include healthcare, cybersecurity, e-commerce, and more. We operate from Israel and Georgia, hosting one of the strongest dev teams in the region. We offer you a pleasant work environment with exceptional opportunities for professional growth.

We are looking for a DevOps / DataOps Engineer to join the remote team of a cutting-edge Israeli data science company that recently closed a Series C funding round, bringing its total funding to over 100M USD. The company offers a revolutionary data science platform powered by augmented data discovery and feature engineering. By automatically connecting to thousands of external data sources and leveraging machine learning to distill the most impactful signals, the platform empowers business leaders and data scientists to drive decision-making, eliminating the barriers to acquiring the right data and enabling superior predictive power.

The “DataOps” title describes a DevOps Engineer with expertise and experience in data infrastructure, supporting mostly Big Data, ETL, and Machine Learning (ML) use cases.

 

Responsibilities:

  • Help us build one of the most complex knowledge systems in the world.
  • Work closely with engineering to build an in-house data infrastructure, working on low-latency, high-throughput production systems.
  • Work closely with DevOps to manage infrastructure, from POC to production.

 

Requirements:

  • Minimum 2 years of experience as a DevOps engineer with a focus on (preferably cloud) data infrastructure (Big Data, ML pipelines, ETL, DWH, …)
  • Solid background and experience in Linux/Unix
  • Hands-on experience in monitoring production critical systems.
  • Ability to define and manage DevOps workflows
  • Hands-on production experience with Elasticsearch - Must.
  • Deep understanding of the modern data stack: orchestration, distributed systems, and cloud infrastructure and services, mainly AWS and GCP.
  • Experience with Docker and Kubernetes environments and data-intensive systems, with a focus on K8s, Kafka, Spark, and the like.
  • Experience with open-source monitoring systems such as Prometheus and Grafana.

 

Bonus:

  • Experience with Python applications and Python programming
  • Experience with Ray or Kubeflow for handling ML pipelines.

 

We offer:

  • Pleasant work environment
  • Coverage of professional certification costs

 

To apply, please send your English CV to [email protected], indicating “DevOps / DataOps Engineer - 1013” in the email subject line. For prompt consideration, please clearly explain in your CV how you meet the technical and other requirements specified above.

 
