Intel
First sync: 2022. I saw the network for what it was—a mess of vulnerable, inefficient data streams. A problem that required a specific skillset. I went to USC to forge that skillset, specializing in the architecture of big data systems. That's where I learned to run the grid, not let it run me.
My function: design and deploy high-velocity data infrastructures. I build the pathways, orchestrate the transfer protocols, and model the output. I take corporate data chaos and forge it into a clean, actionable signal.
Downtime is spent off-grid: exploring the city's verticality, interfacing with classic sci-fi, and running analog simulations in the kitchen.
Core Programming
Master of Science, Computer Science
University of Southern California - Viterbi School of Engineering
Bachelor's, Computer Science & Engineering (Big Data Analytics)
SRM Institute of Science & Technology
Toolkit
Languages & Scripting
Data Engineering Tools
Cloud Platforms & Data Storage
Data Warehousing & Databases
Data Visualization & BI
DevOps & Governance
Past Runs
AI ENGINEER // AI Power [Battery EVO]
Designed and deployed conversational AI chatbots for e-commerce platforms, significantly boosting conversion rates by personalizing user engagement. Implemented a full-stack, event-driven workflow management tool to streamline customer service, sales, product-return, and inventory events across departments.
LEAD RUNNER (R&D) // Andrew & Erna Viterbi School of Engineering (USC)
Engineered a containerized, self-evolving multi-agent RAG framework to orchestrate complex, multi-hop knowledge synthesis for real-time analytics, performance optimization, and query automation.
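A minimal sketch of the multi-hop retrieval loop at the heart of that kind of framework; the agent orchestration, containers, and generation model are out of frame, and the corpus, query, and `retrieve` helper here are purely illustrative.

```python
# Minimal multi-hop retrieval sketch: each hop folds the best-matching
# passage back into the query so the next hop can reach related facts.
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")

# Illustrative corpus standing in for the real knowledge base.
corpus = [
    "Pipeline P-7 writes curated telemetry to the analytics warehouse.",
    "The analytics warehouse is refreshed every five minutes by Snowpipe.",
    "Query latency alerts are raised when refresh lag exceeds ten minutes.",
]
corpus_vecs = encoder.encode(corpus, normalize_embeddings=True)

def retrieve(query: str, hops: int = 2) -> list[str]:
    """Return the passages gathered across `hops` retrieval rounds."""
    gathered = []
    for _ in range(hops):
        q_vec = encoder.encode([query], normalize_embeddings=True)[0]
        scores = corpus_vecs @ q_vec            # cosine similarity on unit vectors
        best = int(np.argmax(scores))
        if corpus[best] in gathered:            # nothing new found, stop early
            break
        gathered.append(corpus[best])
        query = query + " " + corpus[best]      # expand the query with new evidence
    return gathered

print(retrieve("How fresh is the data behind pipeline P-7's analytics?"))
```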
INFILTRATION & ANALYTICS // Annenberg School of Journalism & Communication (USC)
Built scalable ETL pipelines in Snowflake and real-time dashboards (Power BI/Streamlit), reducing overdue-equipment incidents and minimizing AV/network downtime; optimized staffing and CRM workflows, cutting response times.
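A stripped-down sketch of one dashboard panel, assuming a hypothetical EQUIPMENT_LOANS table and placeholder Snowflake credentials; the production ETL and Power BI reports sit outside this snippet.

```python
# Streamlit panel reading overdue-equipment counts straight from Snowflake.
import snowflake.connector
import streamlit as st

@st.cache_data(ttl=300)  # re-run the query at most every five minutes
def load_overdue_counts():
    conn = snowflake.connector.connect(
        account="YOUR_ACCOUNT",
        user="YOUR_USER",
        password="YOUR_PASSWORD",
        warehouse="ANALYTICS_WH",
        database="OPERATIONS",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute(
            "SELECT item_type, COUNT(*) AS overdue "
            "FROM EQUIPMENT_LOANS "
            "WHERE due_at < CURRENT_TIMESTAMP() AND returned_at IS NULL "
            "GROUP BY item_type"
        )
        return cur.fetch_pandas_all()
    finally:
        conn.close()

st.title("Equipment Ops // Overdue Monitor")
df = load_overdue_counts()
st.metric("Total overdue items", int(df["OVERDUE"].sum()))
st.dataframe(df)
```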
FREELANCE ANALYST (FINANCE) // GMG Associates
Automated ERP, CRM, and tax data migration to Azure using Airflow with validation scripts and unit testing, reducing reporting errors and saving 10+ manual hours daily.
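A hypothetical Airflow DAG showing the migration pattern in miniature: extract, validate, load, per source. The `extract`/`validate`/`load` callables, schedule, and connection details are stand-ins, not the production code.

```python
# Hypothetical Airflow DAG mirroring the ERP/CRM/tax migration pattern.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(source: str, **_):
    print(f"pulling {source} export to staging")

def validate(source: str, **_):
    # e.g. compare staged row counts and checksums against the source system
    print(f"validating staged {source} data")

def load(source: str, **_):
    print(f"copying validated {source} data to Azure storage")

with DAG(
    dag_id="erp_crm_tax_migration",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    for source in ("erp", "crm", "tax"):
        t_extract = PythonOperator(
            task_id=f"extract_{source}",
            python_callable=extract,
            op_kwargs={"source": source},
        )
        t_validate = PythonOperator(
            task_id=f"validate_{source}",
            python_callable=validate,
            op_kwargs={"source": source},
        )
        t_load = PythonOperator(
            task_id=f"load_{source}",
            python_callable=load,
            op_kwargs={"source": source},
        )
        t_extract >> t_validate >> t_load
```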
DEEP LEARNING RESEARCH ASSISTANT // SRM Institute of Science & Technology
Engineered a hybrid graph neural network architecture combining GIN layers and GRUs for spatiotemporal PM2.5 forecasting across 184 cities, achieving 7.2% lower RMSE and 12.4% higher CSI scores.
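A shape-only sketch of the GIN-over-space, GRU-over-time idea, not the published architecture; layer sizes, the ring-graph `edge_index`, and the random inputs are illustrative.

```python
# GIN over the city graph at each timestep, GRU over time per city.
import torch
import torch.nn as nn
from torch_geometric.nn import GINConv

class SpatioTemporalPM25(nn.Module):
    def __init__(self, in_feats: int, hidden: int):
        super().__init__()
        self.gin = GINConv(nn.Sequential(
            nn.Linear(in_feats, hidden), nn.ReLU(), nn.Linear(hidden, hidden)
        ))
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)        # next-step PM2.5 per city

    def forward(self, x_seq: torch.Tensor, edge_index: torch.Tensor):
        # x_seq: [timesteps, num_cities, in_feats]
        spatial = torch.stack([self.gin(x_t, edge_index) for x_t in x_seq])
        spatial = spatial.permute(1, 0, 2)      # [num_cities, timesteps, hidden]
        _, h_last = self.gru(spatial)           # temporal summary per city
        return self.head(h_last.squeeze(0))     # [num_cities, 1]

# Toy run: 184 cities, 12 hourly steps, 8 features, a trivial ring graph.
num_cities, steps, feats = 184, 12, 8
edge_index = torch.tensor(
    [list(range(num_cities)), [(i + 1) % num_cities for i in range(num_cities)]]
)
model = SpatioTemporalPM25(feats, 64)
pred = model(torch.randn(steps, num_cities, feats), edge_index)
print(pred.shape)  # torch.Size([184, 1])
```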
DATA ENGINEER // Lemonpeak
Applied Lambda Architecture principles to build sub-second-latency pipelines for real-time subscriber analytics alongside scalable batch processing of data from set-top boxes, mobile apps, and user-engagement platforms.
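A sketch of the speed layer alone, assuming a Kafka topic named stb-events and a console sink in place of the real serving store; the batch layer and actual schema are out of scope here.

```python
# Spark Structured Streaming: per-subscriber event counts over 1-minute windows.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("subscriber-speed-layer").getOrCreate()

event_schema = StructType([
    StructField("subscriber_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "stb-events")
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

counts = (
    events.withWatermark("event_time", "2 minutes")
    .groupBy(F.window("event_time", "1 minute"), "subscriber_id")
    .count()
)

query = (
    counts.writeStream.outputMode("update")
    .format("console")      # stand-in for the real serving-layer sink
    .start()
)
query.awaitTermination()
```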
SOFTWARE DEVELOPMENT INTERN // Yatnam Technologies Private Limited
Built a scalable, UK-focused e-commerce store with warehouse management and an ERP system, backed by REST APIs, ETL pipelines, and CI/CD for Azure data pipelines.
DATA ENGINEER // Arcadia Solutions Inc
Built reliable streaming pipelines for high-volume clickstream ingestion, tracking cart abandonment in real time and triggering automated retargeting campaigns, boosting conversions by 5%.
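A simplified single-process sketch of the cart-abandonment logic, assuming a clickstream-events topic and illustrative event fields; the production version ran as a distributed streaming job rather than one consumer loop.

```python
# Consume clickstream events and flag carts idle for more than 30 minutes.
import json
import time

from kafka import KafkaConsumer

ABANDON_AFTER_S = 30 * 60
last_cart_activity: dict[str, float] = {}

consumer = KafkaConsumer(
    "clickstream-events",
    bootstrap_servers="broker:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    consumer_timeout_ms=1000,   # return periodically so we can sweep for timeouts
)

while True:
    for message in consumer:
        event = message.value
        if event.get("type") in ("add_to_cart", "update_cart"):
            last_cart_activity[event["user_id"]] = time.time()
        elif event.get("type") == "checkout":
            last_cart_activity.pop(event["user_id"], None)

    now = time.time()
    for user_id, seen in list(last_cart_activity.items()):
        if now - seen > ABANDON_AFTER_S:
            # Stand-in for publishing to a retargeting-campaign topic.
            print(f"retargeting trigger -> {user_id}")
            del last_cart_activity[user_id]
```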
Contracts
CONTRACT: NIGHTINGALE
Constructed a secure, end-to-end HIPAA-compliant biometric analytics pipeline using Azure, Spark, and Snowflake. Deployed a readmission risk model with Spark MLlib, integrating MLflow for model tracking and controlled delivery. Sanitized snippet below.
Completed: Dec 2024
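A sanitized sketch of the training step only: a Spark MLlib logistic-regression readmission classifier logged to MLflow. The feature columns, label, and input path are placeholders; the HIPAA controls and Azure/Snowflake plumbing are omitted.

```python
# Train a readmission-risk classifier with Spark MLlib and log it to MLflow.
import mlflow
import mlflow.spark
from pyspark.ml import Pipeline
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.feature import VectorAssembler
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("readmission-risk").getOrCreate()

# De-identified training frame; in production this was read from Snowflake.
df = spark.read.parquet("/data/deidentified_encounters.parquet")

assembler = VectorAssembler(
    inputCols=["age", "prior_admissions", "length_of_stay", "num_medications"],
    outputCol="features",
)
lr = LogisticRegression(featuresCol="features", labelCol="readmitted_30d")
pipeline = Pipeline(stages=[assembler, lr])

with mlflow.start_run(run_name="readmission-risk-lr"):
    model = pipeline.fit(df)
    mlflow.log_param("model_type", "logistic_regression")
    mlflow.spark.log_model(model, artifact_path="model")
```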
CONTRACT: GHOST-STREAM
Routed IoT telemetry from firewalled Azure Blob Storage to AWS S3 via Kinesis. Transformed data packets with Lambda and invoked SageMaker for real-time anomaly detection before ingestion into Snowflake via Snowpipe. Sanitized snippet below.
Completed: Sep 2024
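A hypothetical Lambda handler for the score-and-stage hop: decode Kinesis records, call a SageMaker endpoint for anomaly scores, and stage newline-delimited JSON to S3 for Snowpipe pickup. The endpoint and bucket names are placeholders.

```python
# AWS Lambda: Kinesis batch -> SageMaker scoring -> S3 stage for Snowpipe.
import base64
import json
import uuid

import boto3

sagemaker = boto3.client("sagemaker-runtime")
s3 = boto3.client("s3")

ENDPOINT = "iot-anomaly-detector"
STAGE_BUCKET = "ghost-stream-snowpipe-stage"

def handler(event, context):
    scored = []
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        response = sagemaker.invoke_endpoint(
            EndpointName=ENDPOINT,
            ContentType="application/json",
            Body=json.dumps(payload),
        )
        payload["anomaly_score"] = json.loads(response["Body"].read())["score"]
        scored.append(payload)

    # One JSON object per line, the layout the Snowpipe stage expects here.
    s3.put_object(
        Bucket=STAGE_BUCKET,
        Key=f"telemetry/{uuid.uuid4()}.json",
        Body="\n".join(json.dumps(row) for row in scored).encode("utf-8"),
    )
    return {"scored": len(scored)}
```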
CONTRACT: GRIDLOCK
Simulated real-time urban traffic data streams with Kafka & Spark Streaming. Built an XGBoost model to forecast trip durations and exposed it via a REST API that scores live JSON packets from the stream. Sanitized snippet below.
Completed: Jul 2024
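A minimal serving sketch: a FastAPI endpoint scoring incoming JSON packets with a pre-trained XGBoost regressor. The model file and feature set are placeholders for the actual trip-duration model.

```python
# Serve trip-duration predictions from a pre-trained XGBoost model.
import numpy as np
import xgboost as xgb
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

model = xgb.XGBRegressor()
model.load_model("trip_duration.json")   # trained offline on the simulated stream

class TripPacket(BaseModel):
    distance_km: float
    hour_of_day: int
    day_of_week: int
    pickup_zone: int
    dropoff_zone: int

@app.post("/predict")
def predict(packet: TripPacket):
    features = np.array([[
        packet.distance_km,
        packet.hour_of_day,
        packet.day_of_week,
        packet.pickup_zone,
        packet.dropoff_zone,
    ]], dtype=float)
    duration = float(model.predict(features)[0])
    return {"predicted_trip_duration_min": duration}
```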
Runner's Manifesto
1. Data is the lifeline. Information is the most valuable asset in the grid. My primary directive is to ensure its integrity, security, and velocity. Every byte has a purpose; every stream, a destination. I don't just move data; I ensure it arrives with meaning.
2. Efficiency is the edge. In the digital expanse, speed is survival. I build pipelines that are not just fast, but ruthlessly efficient. I optimize for performance, eliminate bottlenecks, and ensure that every process is streamlined for peak operational capacity. Wasted cycles are vulnerabilities.
3. Security is the foundation. A breach in the data stream is a critical failure. I design with a security-first mindset, architecting systems that are resilient to intrusion and fortified against threats. Trust is built on the bedrock of secure infrastructure.
4. Adapt or be deprecated. The grid is in constant flux. Technologies evolve, and threats adapt. My commitment is to continuous learning and evolution. I stay ahead of the curve, integrating new tools and methodologies to maintain a competitive edge and ensure the systems I build are future-proof.
Secure Channel
Have a contract or need to establish a secure line of communication? Use the form below. All transmissions are encrypted and routed directly to my terminal.