b_labs is the transformation and digitization arm of B.TECH, on a mission to change the industry as we know it. We spearhead tech innovation at B.TECH, an organization that has been a cornerstone of the Egyptian retail industry. We are the engine powering B.TECH in achieving its goal of becoming the leading omni-channel platform for consumer electronics and appliances in Egypt. By joining b_labs, you will benefit from a collaborative startup environment while also enjoying the stability of working within a company that has achieved strong growth year after year. You will be part of a visionary, customer-focused team with an ambitious mission to become a trailblazer for digital retail in the Middle East.
Responsibilities
- Design, develop, and maintain large-scale, reliable data pipelines using Python, SQL, and big data technologies such as Apache Spark and Kafka.
- Build and optimize ETL/ELT processes for data transformation, loading, and integration from multiple sources.
- Develop and maintain data storage solutions using both relational and NoSQL databases, including SQL Server, PostgreSQL, MySQL, and MongoDB.
- Implement and manage CI/CD pipelines for data workflows, enabling automated deployments and version control.
- Work with AWS services to build, deploy, and monitor cloud-based, scalable data solutions.
- Leverage Apache Airflow for orchestrating workflows and PostHog for analytics tracking and event data.
- Manage and enhance data warehousing solutions to support business intelligence and analytics needs.
- Ensure data accuracy, consistency, and security across diverse systems and sources.
- Troubleshoot and optimize data systems for performance, scalability, and cost efficiency.
- Actively promote and contribute to a collaborative, innovative, and agile team culture.
Requirements
- 5+ years of experience in data engineering, building and maintaining production-grade data pipelines and architectures.
- Proficient in Python and SQL.
- Hands-on with relational databases (SQL Server, PostgreSQL, MySQL) and NoSQL (MongoDB).
- Experience with big data and stream-processing tools (e.g., Apache Spark, Kafka).
- Skilled in implementing CI/CD pipelines for data workflows.
- Strong understanding of AWS services (S3, Redshift, Lambda, Glue).
- Experience with Apache Airflow for workflow orchestration.
- Familiarity with PostHog or Amplitude for analytics tracking and event management.
- Comfortable with Docker, Kubernetes, and Linux shell scripting.
- Solid grasp of data modeling, warehousing, scalability, and reliability best practices.
- Proven ability to ensure data quality, governance, and security.
- Strong communication skills and a collaborative mindset.
- Passion for continuous learning and staying updated on emerging technologies.
Benefits
Office environment: When you come to our b_labs office, you'll find creative workspaces and an open design that fosters collaboration between teams.
Flexibility: You know best whether you want to work from home or in the office.
Equipment: From Day 1, you will receive all the equipment you need to be successful at work.