About Me
Mujahid is a results-driven, customer-focused, articulate, and analytical Big Data Engineer & Professional Services Consultant who thinks "out of the box", with strong design and integration problem-solving skills. Experienced in Big Data technologies including Hadoop, Spark, Hive, Teradata, Redshift, Kinesis, EMR, S3, EC2, and RDS. Strong experience in open-source technologies such as Apache Flink, Apache Airflow, Apache Kafka, Redis, Elasticsearch, MySQL, MongoDB, and Docker. Development experience in Java, Python, Scala, and C# covering data processing, analysis, and design. Skilled in developing business plans, requirements specifications, user documentation, and architectural systems, with strong written and verbal communication. Interested in Big Data challenges.
- Name: Mujahid Niaz
- Phone: +49 1573 789 6168
- LinkedIn: https://www.linkedin.com/in/mujahidniaz
- City: Berlin, Germany
- Email: mujahid.niaz002@gmail.com
- Github: https://www.github.com/mujahidniaz
- Degree: Bachelors in Computer Science
My Resume
I have several years of experience in Software & Big Data Engineering. During this period, I have worked for Fides Technologies, Teradata, Krieger Digital, Sonova Global, and currently Databricks.
Professional Experience
Sr. Specialist Solutions Engineer
January 2025 - Present
Databricks, Berlin, Germany
- Partner with customers to understand data challenges and align technical solutions with business objectives
- Lead design and implementation of Proof-of-Concepts showcasing Databricks capabilities across data engineering, analytics, and machine learning
- Demonstrate Unified Analytics Platform expertise (Apache Spark, Delta Lake, MLflow) across AWS, Azure, and GCP
- Deliver technical presentations, demos, and workshops to drive platform adoption
- Guide customers through successful platform adoption with training, best practices, and ongoing support
- Work closely with Presales, Sales, and Account Teams to enable data-driven innovation
Big Data Engineer
2023 - 2025
Sonova Global, Berlin, Germany
- Designed and maintained scalable data pipelines for efficient data collection, processing, and storage
- Implemented data transformation and enrichment processes ensuring high-quality, consistent analytics data
- Utilized Snowplow Analytics open-source pipeline for event-level customer interaction insights
- Architected centralized Data Hub for global data collection and reporting across AWS and Azure
- Monitored and optimized data pipeline performance for efficiency and cost-effectiveness
- Implemented CI/CD automation to streamline development and deployment workflows
Big Data Engineer
2020 - 2023
Krieger Digital, Berlin, Germany
Working as a Big Data Engineer, mostly responsible for:
- Managing Upgrades and Services on AWS Cloud infrastructure
- Deploying and Managing Services using Helm Charts on Kubernetes
- Building and managing ETL data pipelines for multiple data sources to collect data in Redshift
- Building and managing real-time data pipelines using AWS Kinesis and Apache Flink
- Contributing to the development of an internal Data Warehouse
- Data Processing using Apache Spark (Scala + Python) and Pandas in Python
- Working on Kubernetes to automate deployments using GitLab CI/CD
- Data Visualizations and Dashboards using Apache Superset
Global Delivery Ctr Consultant (Big Data Analytics)
2016 - 2020
Teradata Global Delivery Center, Islamabad, Pakistan
Worked at Teradata GDC Pakistan as a Global Delivery Center Consultant, gaining diverse experience across multiple domains through work on a variety of projects, including the following major projects:
- Data Migration between Teradata & Hadoop Cluster
- Building Big Data Pipelines with Kafka->Spark->HBase->Hive
- Building Real-Time data pipelines with Apache Kafka & Spark
- Teradata Azure Deployment Accelerator
- Setting up Teradata DWH (Vantage) on AWS.
- Application Deployment using Docker Containers
- Teradata GDC Status Portal (.Net + SQL Server)
- Teradata Automations & Performance Enhancements with UDFs (Java)
- Software Development (Java, C#, C++, Python, Scala, JavaScript)
Software Engineer (Machine Learning)
2015 - 2016
Fides Technologies, Islamabad, Pakistan
I worked with FIDES Technologies for six months as a Software Engineer (Machine Learning) on a project to create a customer behavior analysis using data mining techniques, helping the telecom industry with one-to-one customer marketing, analyzing telecom market trends, and presenting them in interactive graphical visualization dashboards (D3.js).
Education
Bachelors in Computer Science
2012 - 2016
National University of Computer and Emerging Sciences, Islamabad, Pakistan
Majors: Data Mining, Information Retrieval, Concurrent & Distributed Computing, Digital Image Processing, Artificial Intelligence, Data Warehousing, Human Computer Interaction, Software Engineering, and Web Programming
Certifications & Trainings
Databricks Certified Data Engineer Associate
Certification validating skills in building and maintaining data pipelines using Databricks Lakehouse Platform. Demonstrates proficiency in data ingestion, transformation, optimization, and managing data workflows for scalable analytics.
Databricks Certified Generative AI Engineer Associate
Credential demonstrating knowledge of applying generative AI and large language models (LLMs) on the Databricks platform. Covers prompt engineering, fine-tuning, model deployment, and integration of AI workflows within the Lakehouse architecture.
Teradata 14 Certified Professional
Certification covering setting up a Teradata Data Warehouse from scratch in a new environment
Teradata - Intro to Vantage
Training on the Teradata Vantage analytics solution
DevOps Generalist
DevOps Basics and Essentials
Microsoft Certified: Azure Fundamentals
Intro to Microsoft Azure Services
AWS Business Professional
Accreditation covering AWS services from a business point of view
AWS Technical Professional
Accreditation covering AWS services from a technical point of view
Awards & Recognitions
3 Years of Service Award (R&D) 2020 by Teradata Corporation
Received recognition for my work in the R&D field
Teradata Innovation Champion 2018
Received recognition for developing an out-of-the-box solution for Teradata internal tools for cloud deployments
2nd Position in App Development SIST 2019
Participated in and won 2nd position in the app development competition at SIST 2019