Senior Software Engineer
 
Provided By: Holovo
Published: 03 June / Deadline: 03 July
HOLOVO is pleased to announce an opening for the position of Senior Software Engineer.

Job Type: Full-Time
Experience Level: Senior (7+ years)

About Us:

HOLOVO is a next-generation agency designed to empower businesses and professionals to thrive in the AI age. Our mission is to make artificial intelligence accessible, practical, and genuinely beneficial — for organizations of all sizes and for individuals eager to embrace innovation. We specialize in three areas: tailored AI-driven business solutions, fast and innovative AI-powered design services, and educational programs that demystify AI for businesses and individuals alike.

At HOLOVO, we believe in ethical innovation, authentic partnerships, and data-driven decision-making. We are trusted advisors, visionary collaborators, and enthusiastic partners, passionate about helping our clients unlock new opportunities in a fast-changing world.

About the Role:

We are seeking a highly skilled Senior Software Engineer to join our growing team. This role demands technical depth, strategic thinking, and a hands-on approach to building scalable solutions that power our data and ML infrastructure. The ideal candidate will bring strong software engineering expertise, with a focus on API integration, data engineering, DevOps and MLOps practices.

Key Responsibilities:

** API Development & Integration: Build, maintain and secure backend services and integrations, with a strong emphasis on working with Google APIs (e.g., Sheets, Drive, Maps). You'll be responsible for authentication flows, request optimization, error handling and reliable communication between systems.
** Data Pipeline Engineering with PySpark: Design and develop scalable data pipelines using PySpark to process large volumes of data from diverse sources. This includes managing both batch and near-real-time data workflows, as well as optimizing jobs for performance, cost and maintainability.
** Redshift Data Management & Optimization: Work with Amazon Redshift as the primary data warehouse. Responsibilities include writing complex SQL for analytics use cases, optimizing queries, managing schema design, handling large table operations and integrating Redshift with upstream/downstream systems.
** MLOps & SageMaker Integration: Collaborate with data scientists to support the deployment and operationalization of machine learning models using AWS SageMaker. You'll help implement MLOps practices, including model versioning, retraining pipelines, automated deployment, monitoring and rollback strategies.
** DevOps & Cloud Infrastructure: Manage and maintain AWS environments, including account structure, IAM roles/policies, cost monitoring and infrastructure automation using tools like Terraform or CloudFormation. You'll also contribute to CI/CD workflows and ensure robust monitoring and logging are in place across services.
** Data Visualization & Reporting (Bonus): While not required, experience working with Power BI to support business reporting needs or integrate with backend systems is a strong plus.

Required Skills & Qualifications:

** 7+ years of professional software engineering experience.
** Strong experience with PySpark and data processing frameworks.
** Proficiency in integrating Google APIs (e.g., Google Sheets, Drive, Maps).
** Experience working with AWS cloud services, including EC2, S3, IAM, Glue, Redshift and SageMaker.
** Solid understanding of MLOps practices, such as model versioning, monitoring and automated retraining.
** Expertise in DevOps practices, including infrastructure-as-code (Terraform/CDK), CI/CD (GitHub Actions, Jenkins) and cost/account management in AWS.
** Strong understanding of database systems, including schema design and query optimization.
** Familiarity with BI tools such as Power BI, Tableau or similar (Power BI preferred).
** Proficient in Python and familiar with other modern programming languages (e.g., Java, Scala).
** Strong communication skills and the ability to work effectively in cross-functional teams.

Nice to Have:

** Experience with containerization tools like Docker and orchestration systems like Kubernetes.
** Knowledge of data lake architectures and tools such as AWS Glue and Lake Formation.
** Familiarity with data governance, privacy, and security best practices.
** Prior experience in leading or architecting technical solutions for data-intensive applications.

Are you interested? Please send your CV in English to info@holovo.ai

Applications will be reviewed as they are received.
 