Why Work With Us?
100% Remote Flexibility
Work from anywhere in India with complete flexibility and work-life balance.
Cutting-Edge Projects
Build enterprise solutions for leading healthcare and technology organizations.
Growth-Oriented Culture
Continuous learning, training, and career development opportunities.
Collaborative Environment
Work alongside expert professionals who value innovation and teamwork.
Immediate Impact
Your work drives real business outcomes and technological transformation.
Referral Rewards
Earn bonus rewards for successful referrals through our employee program.
Current Openings
We're actively hiring mid-level professionals (3-5 years of experience) for immediate joining.
Databricks Developer
About the Role
Join HiFour's data engineering team to build scalable data solutions using Databricks. You'll design and implement robust data pipelines that power analytics and business intelligence for our enterprise clients.
Key Responsibilities
- Design, develop, and optimize ETL pipelines using Databricks platform
- Build and maintain data lakehouse architectures for enterprise-scale data processing
- Implement data transformations using PySpark, SQL, and Python
- Collaborate with data scientists and analysts to deliver actionable insights
- Ensure data quality, security, and governance across all data workflows
Required Skills
- Strong proficiency in Databricks platform and Apache Spark
- Expertise in SQL, Python, and PySpark
- Experience with ETL/ELT pipelines and data modeling
- Understanding of data lakehouse architecture and Delta Lake
- Knowledge of cloud platforms (AWS/Azure)
Databricks DBA
About the Role
We're seeking a skilled Databricks DBA to manage and optimize our cloud data platform infrastructure. You'll ensure high performance, security, and reliability of our Databricks environments supporting critical business operations.
Key Responsibilities
- Administer and maintain Databricks workspaces, clusters, and compute resources
- Implement security policies, access controls, and compliance measures
- Monitor and optimize cluster performance, cost, and resource utilization
- Troubleshoot and resolve platform issues to ensure high availability
- Establish best practices for platform usage and maintenance
Required Skills
- Strong expertise in Databricks administration and management
- Experience with cloud platforms (AWS/Azure) and their data services
- Knowledge of performance tuning and optimization techniques
- Understanding of security, compliance, and governance frameworks
- Proficiency in SQL, Python, and shell scripting
Data Engineer (AWS)
About the Role
Be part of our data engineering team building scalable AWS-based data solutions. You'll design and implement data pipelines, warehouses, and analytics platforms that drive business intelligence and decision-making.
Key Responsibilities
- Design and build ETL pipelines using AWS Glue, Lambda, and Step Functions
- Develop data warehousing and query solutions using Amazon Redshift and Athena
- Implement data lake architectures on Amazon S3
- Create interactive dashboards and reports using Amazon QuickSight
- Write Python scripts for data processing, validation, and automation
Required Skills
- Strong expertise in AWS Glue, Athena, S3, and QuickSight
- Proficiency in Python for data engineering tasks
- Experience with SQL and data modeling
- Knowledge of data warehouse concepts and best practices
- Understanding of AWS security and IAM policies
DevOps Engineer (Azure)
About the Role
Join our DevOps team to build and maintain robust cloud infrastructure on Azure. You'll implement GitOps practices, automate deployments, and ensure reliable, scalable systems for our enterprise applications.
Key Responsibilities
- Design and manage Azure cloud infrastructure using Terraform
- Implement CI/CD pipelines using GitHub Actions and GitOps workflows
- Deploy and manage containerized applications using Kubernetes and Helm charts
- Set up monitoring, logging, and alerting solutions
- Automate infrastructure provisioning and configuration management
Required Skills
- Strong expertise in Microsoft Azure services (AKS, VMs, Networking)
- Proficiency in Terraform for infrastructure as code
- Experience with Kubernetes, Docker, and Helm charts
- Knowledge of GitOps principles and GitHub/GitHub Actions
- Strong scripting skills (Bash, PowerShell, Python)
DevOps Engineer (AWS)
About the Role
We're looking for an AWS DevOps Engineer to build and maintain scalable cloud infrastructure. You'll implement GitOps workflows using GitLab, automate deployments, and ensure high availability of our AWS-based systems.
Key Responsibilities
- Design and manage AWS cloud infrastructure using Terraform
- Implement CI/CD pipelines using GitLab CI/CD and GitOps practices
- Deploy and orchestrate containerized applications using EKS and Helm charts
- Configure monitoring, logging, and alerting solutions
- Optimize cloud costs and resource utilization
Required Skills
- Strong expertise in AWS services (EKS, EC2, VPC, S3, RDS)
- Proficiency in Terraform for infrastructure as code
- Experience with Kubernetes, Docker, and Helm charts
- Knowledge of GitOps principles and GitLab/GitLab CI/CD
- Strong scripting skills (Bash, Python)
Frontend UI Developer
About the Role
Join our frontend team to create beautiful, responsive web applications. You'll work with modern technologies like Next.js and React to build user interfaces that delight our clients and their users.
Key Responsibilities
- Develop responsive web applications using Next.js and React
- Implement pixel-perfect UI designs using Tailwind CSS
- Write clean, type-safe code using TypeScript
- Build reusable components and maintain component libraries
- Integrate frontend applications with REST APIs and GraphQL
Required Skills
- Strong expertise in Next.js and React
- Proficiency in TypeScript and modern JavaScript (ES6+)
- Experience with Tailwind CSS and responsive design
- Understanding of state management (Redux, Zustand, Context API)
- Knowledge of RESTful APIs and asynchronous programming
Backend Developer (Python)
About the Role
We're seeking talented Python developers to build scalable backend systems and APIs. You'll work with modern frameworks and event-driven architectures to create robust solutions for our enterprise clients.
Key Responsibilities
- Design and develop RESTful APIs using FastAPI and Python
- Build microservices architecture and event-driven systems
- Implement message queuing and streaming using Apache Kafka
- Develop database schemas and optimize query performance
- Write clean, maintainable, and well-documented code
Required Skills
- Strong expertise in Python and FastAPI (or similar frameworks)
- Experience with Apache Kafka or other message brokers
- Proficiency in SQL and NoSQL databases (PostgreSQL, MongoDB, Redis)
- Understanding of microservices architecture and design patterns
- Knowledge of RESTful API design and best practices
Ready to Join Our Team?
Send your resume and let's start a conversation about your future with HiFour.
hr@hifour.io