DevOps / DataOps Engineer – Baku, Azerbaijan
Olmait works on innovative and challenging projects, building dedicated teams in data science, data-related software, and cloud infrastructure for startups internationally. We work in industries and fields of expertise that include healthcare, cybersecurity, e-commerce, and more. We operate from Israel and Georgia, hosting one of the strongest dev teams in the region. We offer you a pleasant work environment with exceptional opportunities for professional growth.
Olmait is pleased to announce that it is looking for a DevOps / DataOps Engineer for an Israeli startup that is revolutionizing data science. Its automated data and feature discovery platform connects its customers to thousands of data sources. The startup's solution empowers organizations across industries such as financial services, retail and e-commerce, consumer packaged goods, healthcare, marketing, human resources, and the public sector.
The “DataOps” in this position's title means a DevOps Engineer with expertise and experience in data infrastructure, supporting mostly Big Data, ETL, and machine learning (ML) use cases.
- Help us build one of the most complex knowledge systems in the world.
- Work closely with engineering to build an in-house data infrastructure, working on low-latency, high-throughput production systems.
- Work closely with DevOps to manage infrastructure, from POC to production.
- Minimum 2 years of experience as a DevOps engineer with a focus on (preferably cloud) data infrastructure (Big Data, ML pipelines, ETL, DWH, …)
- Solid background and experience in Linux/Unix
- Hands-on experience in monitoring production-critical systems.
- Ability to define and manage DevOps workflows
- Hands-on production experience with Elasticsearch – a must.
- Deep understanding of the modern data stack: orchestration, distributed systems, and cloud infrastructure and services, mainly AWS and GCP.
- Experience with Docker and Kubernetes environments and data-intensive systems, with a focus on K8s, Kafka, Spark, and the like.
- Experience with open-source monitoring systems such as Prometheus and Grafana
- Experience with Python applications and Python programming
- Experience with Ray or Kubeflow for handling ML pipelines.
- Pleasant and flexible work environment
- Working on data, AI, and cloud projects – the hottest, cutting-edge topics in today's technology world
- Working with American and Israeli clients – the world leaders in data and AI
- Challenge to innovate and find creative solutions for clients’ needs
- Very competitive remuneration for the region
- Health insurance
- Coverage of professional certification costs
To apply, please send your CV to [email protected], indicating “DevOps / DataOps Engineer-1013” in the email subject. For prompt consideration, please explain clearly in your CV how you meet the technical and other requirements specified above.