From configuring network devices by hand at Huawei to building 17-stage data pipelines at Infineon, every role has taught me that the biggest leverage is in removing toil. Here's the path forward.
Network automation support intern at KICS/UET, Lahore. First exposure to scripting, troubleshooting, and runbook standardization. Manual processes everywhere and a growing desire to automate them.
Software Automation Intern at Infineon Technologies, Austria. Building EDA data pipelines (17-stage Medallion architecture), enhancing Jenkins CI/CD (~20% faster), integrating SonarQube quality gates, and working on my MSc thesis.
Full-stack AI & Automation Engineer designing intelligent systems that automate, learn, and scale. Combining software engineering, data pipelines, and ML to build platforms that make engineers more effective.
In every role I've held, from telecom operations in Pakistan to semiconductor engineering in Austria, I've seen the same pattern: talented engineers spending most of their time on repetitive, manual tasks. Copy-paste configurations. Manual deployments. No monitoring. No version control. Status quo maintained because "that's how we've always done it."
I decided early in my career that I wanted to be the person who breaks that cycle, not by criticizing the process, but by building something better.
Engineers doing the same tasks by hand every day. No scripts, no automation, no templates. Hours lost to work that a well-written Python script could handle in seconds.
Critical know-how lived in people's heads, not in code or documentation. When someone left, the knowledge walked out with them. No runbooks, no shared understanding.
No CI/CD pipelines. Manual testing. Code shared via email or USB. Releases were rare, risky, and stressful events instead of routine, confident deployments.
Systems running without proper monitoring. Problems discovered when users complained, not when metrics spiked. Reactive firefighting instead of proactive prevention.
Data scattered across spreadsheets, file shares, and email threads. No structured pipelines. No data governance. Making decisions based on gut feeling instead of evidence.
Code written without documentation, without tests, without reviews. Technical debt accumulated silently until refactoring became impossible.
The problems weren't technical; they were cultural. Organizations had the talent but lacked the automation mindset. My goal became simple: demonstrate that automation isn't a threat to engineers but a force multiplier that lets them focus on what matters.
At each stage of my career, I've built tools and systems that moved teams from manual to automated. Here's what's live and proven:
17-stage Medallion architecture (Bronze → Silver → Gold) for EDA data transformation. Structured, validated, and AI-ready data from fragmented engineering sources.
Enhanced build pipelines with modular stages, parallel execution, and artifact management. ~20% faster build-to-deploy cycles.
Code quality governance integrated as automated pipeline stages. Enforces standards on coverage, complexity, and code smells, cutting manual review effort by 15-25%.
Containerized build and deployment pipeline at Huawei. Jenkins + Docker achieving ~30% faster release cycles and reproducible environments.
Automated MPLS/VPN operational tasks with custom Python utilities. Reduced manual operational effort by ~40% across the team.
Anomaly detection, forecasting, and classification models deployed via FastAPI endpoints and Dockerized services. MLFlow for experiment tracking.
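Serving stack aside, the core of an anomaly detector can be sketched in a few lines of plain Python. The z-score rule, threshold, and latency numbers below are illustrative stand-ins for the trained models behind the FastAPI services, not the deployed code; the contract is the same either way: metrics in, flagged points out.

```python
from statistics import mean, stdev

def detect_anomalies(values, z_threshold=2.0):
    """Flag indices whose z-score exceeds the threshold.

    A deliberately simple stand-in for a trained detector. With small
    samples the outlier inflates the sample stdev, so the threshold
    here is lower than the textbook 3.0.
    """
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > z_threshold]

latencies = [12, 13, 11, 12, 14, 13, 95, 12]  # one obvious spike
print(detect_anomalies(latencies))  # → [6], the spike
```

A production version would sit behind a FastAPI endpoint and log runs to MLFlow, but the interface stays this small.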
MPLS/VPN, SNMP, troubleshooting, runbook standardization
Jenkins, Docker, GitLab, SonarQube, Python automation
Medallion pipelines, SQL, data validation, structured ETL
PyTorch, Scikit-Learn, FastAPI, MLFlow, SDN
At every role, I follow the same approach: identify a manual process → understand it deeply → wrap it in Python → add tests and CI → deploy with monitoring. Then repeat for the next process. Each iteration builds on the last.
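That loop, applied to a hypothetical network task, looks like this: take something an engineer reads by eye, wrap it in a small Python function, and give CI a test to run on every commit. The CLI output format and function names below are invented for illustration, not taken from any real device.

```python
import re

# Steps 2-3 of the loop: a manual task -- reading interface state out
# of CLI output by eye -- wrapped in a small, testable function.
# The 'show interfaces brief'-style format here is a simplified,
# hypothetical example.
def parse_interface_states(show_output: str) -> dict:
    """Map interface name -> state from CLI text."""
    pattern = re.compile(r"^(\S+)\s+(up|down)$", re.MULTILINE)
    return {name: state for name, state in pattern.findall(show_output)}

# Step 4 of the loop: a test that CI can run on every commit.
def test_parse_interface_states():
    sample = "Gi0/1 up\nGi0/2 down\n"
    assert parse_interface_states(sample) == {"Gi0/1": "up", "Gi0/2": "down"}

test_parse_interface_states()
```

Once the function exists, monitoring and scheduling are incremental additions rather than rewrites; that is why the loop compounds.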
2018: Manual Operations
Work Style: Manual troubleshooting, ticket-by-ticket resolution
Tools: Basic networking, SNMP, manual runbooks
Deployment: No CI/CD. Manual uploads. Hope-driven releases.
Data: Scattered logs, no dashboards, reactive monitoring
Mindset: Learn everything, fix what's in front of me

2023: DevOps & Automation
Work Style: Python-first automation, reducing manual effort by ~40%
Tools: Jenkins, Docker, GitLab, SonarQube
Deployment: CI/CD pipelines, ~30% faster release cycles
Data: Prometheus, Grafana, structured monitoring
Mindset: Automate first, then optimize. Start MSc in Communication Engineering.

2025+: AI-Assisted Engineering
Work Style: AI-augmented engineering, data-driven decisions
Tools: PyTorch, MLFlow, FastAPI, Medallion pipelines
Deployment: Quality-gated CI/CD, SonarQube, continuous deployment
Data: Structured pipelines: Bronze → Silver → Gold
Mindset: Build platforms, not scripts. Think in systems.
I'm in the transition from DevOps automation to AI-assisted engineering. The foundation is solid: CI/CD, quality gates, containerization, monitoring. Now I'm layering data pipelines and machine learning on top. My MSc thesis on Automated Network Configuration Using SDN directly bridges these worlds.
My work sits at the intersection of DevOps, MLOps, and Data Engineering. Each role sharpened a different layer. Now the goal is to combine them into one platform-level skill set.
Jenkins pipelines, Docker, GitLab CI, SonarQube quality gates. I've reduced release cycles by ~20% and eliminated manual review bottlenecks. Next: Kubernetes orchestration and IaC with Terraform.
MLFlow experiment tracking, model versioning, FastAPI serving. Next: feature stores, automated retraining pipelines, and model monitoring in production (drift detection, A/B rollout).
17-stage Medallion pipeline (Bronze → Silver → Gold → Nectar), Parquet schemas, automated data quality gates. Next: Apache Airflow / Dagster for orchestration, dbt for transformations, streaming with Kafka.
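A data quality gate between layers, reduced to plain Python, might look like the sketch below. The required columns and the null check are assumptions for illustration, not the actual EDA schema; the real gates run as automated pipeline stages.

```python
# A minimal sketch of an automated quality gate that decides which rows
# are promoted from one Medallion layer to the next. The column names
# are hypothetical, not the production schema.
REQUIRED = {"run_id", "timestamp", "value"}

def quality_gate(rows):
    """Split rows into (promoted, rejected) before the next layer."""
    promoted, rejected = [], []
    for row in rows:
        missing = REQUIRED - row.keys()
        if missing or row.get("value") is None:
            rejected.append(row)  # quarantined for inspection, not dropped silently
        else:
            promoted.append(row)
    return promoted, rejected

good, bad = quality_gate([
    {"run_id": 1, "timestamp": "2025-01-01", "value": 3.2},
    {"run_id": 2, "timestamp": "2025-01-01", "value": None},  # fails the gate
])
```

The same shape scales up: swap the dict check for a Parquet schema validation and wire the rejected bucket into monitoring.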
Applied ML (CNN, RNN, anomaly detection), GPT integration for automation use cases. Next: RAG pipelines, AI agents, fine-tuning LLMs, and vector databases (Pinecone, Weaviate).
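Since RAG sits on the "next" list rather than among shipped work, here is only a toy sketch of the retrieval step, using term-count cosine similarity in place of embeddings. A real pipeline would swap in an embedding model and a vector database such as Pinecone or Weaviate; this shows just the shape of the idea.

```python
from collections import Counter
from math import sqrt

# Toy retrieval step for a RAG pipeline: score documents against the
# query by cosine similarity of term counts; the top hit would then be
# passed to the LLM as context.
def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list) -> str:
    """Return the document most similar to the query."""
    q = Counter(query.lower().split())
    return max(docs, key=lambda d: cosine(q, Counter(d.lower().split())))

docs = ["jenkins pipeline stages", "mlflow experiment tracking", "docker image builds"]
print(retrieve("track experiments with mlflow", docs))  # → "mlflow experiment tracking"
```

Replacing `Counter` with embedding vectors and `max` with a vector-store query is the whole jump from this toy to the planned LangChain pipeline.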
By 2028, I want to lead the design of end-to-end automation platforms where CI/CD, data pipelines, ML models, and AI agents work as one system. Not replacing engineers, but giving every engineer the leverage of a full operations team. Kubernetes for orchestration, Terraform for infrastructure, MLFlow for models, and custom AI agents that handle the rest.
Kubernetes (CKA) certification by end of 2026. Terraform for IaC in current projects. Apache Airflow for pipeline orchestration. RAG pipelines with LangChain for intelligent document processing. Building in public, shipping real projects, and documenting the journey here.