Narendra Kumar
Phone: +1 404-474-3792
Email: [email protected]
LinkedIn: LinkedIn
Professional Summary
5 years of experience as a Snowflake Developer/Admin, specializing in designing, developing, and managing
scalable cloud data warehousing solutions.
Expertise in Snowflake features such as Streams, Tasks, Time Travel, Zero-Copy Cloning, and External Functions
to enable efficient ETL pipelines, real-time data processing, and seamless machine learning integration.
Proficient in advanced query optimization, schema design (Star and Snowflake Schema), and implementing best
practices for cost-efficient and high-performance Snowflake operations.
Skilled in integrating Snowflake with popular cloud services including Azure (Azure Data Factory, Synapse
Analytics, Azure ML), AWS (S3, Redshift, Glue), and Google Cloud (BigQuery, Dataflow) for end-to-end data
workflows.
Strong focus on data security, governance, and compliance (HIPAA, GDPR, SOX) using Role-Based Access Control
(RBAC), row-level security, data masking, and encryption.
Hands-on experience with CI/CD tools (Terraform, Git, Jenkins) to automate infrastructure provisioning and data
pipeline deployments.
Proficient in dbt (Data Build Tool) for data transformation and ensuring consistent data pipeline quality.
Adept at creating dynamic and interactive Power BI dashboards, enabling self-service reporting and data-driven
decision-making across teams.
Experienced in managing semi-structured and structured data (JSON, Parquet, HL7, FHIR) using Snowflake’s Variant
data type and integrating healthcare interoperability standards.
Expertise in machine learning workflows, leveraging MLflow, Azure ML, and Snowflake External Functions for
predictive analytics and operationalizing AI models.
Knowledgeable in data sharing, replication, and Snowflake marketplace for effective stakeholder collaboration.
SnowPro Core certified, and skilled in Agile methodologies, ensuring delivery of incremental, high-impact solutions.
Technical Skills
Operating Systems: Linux, Windows XP/7/8/10, macOS
Programming Languages: Python, Core Java, C, JavaScript, R, Scala
Big Data Technologies: HDFS, Hive, MapReduce, Pig, Sqoop, Flume, Oozie, Hadoop distributions, Snowflake, Spark, Spark Streaming, Kafka, ETL (NiFi, Talend, Informatica, SSIS)
Data Warehousing & Modeling: Snowflake, Azure Synapse Analytics, Teradata, SQL Server, Oracle, Dimensional Modeling (Star Schema, Snowflake Schema), Data Vault, Erwin, ER/Studio
Databases: Snowflake, SQL Server, Oracle 19c, Teradata V17, Azure Cosmos DB, HBase
Utilities/Tools: dbt, Git, GitHub, Bitbucket, Jira, Jenkins, Terraform, Power BI, Tableau, Alteryx, Maven, Log4j, Ant, Visual Studio Code
Cloud Services: GCP (Compute Engine, Cloud Storage, Dataproc, Cloud SQL, Cloud Functions, Cloud Monitoring, Filestore, Autoscaler, BigQuery, Deployment Manager, Dataflow, etc.), Azure (Azure Databricks, Azure Synapse Analytics, Azure Service Bus, Azure HDInsight, Azure Data Factory, Azure Storage), AWS (S3, Redshift, AWS Glue)
Data Visualization & ETL Tools: Tableau, Power BI, Google Data Studio, dbt, Informatica PowerCenter, SSIS, Talend, Azure Data Factory, Apache NiFi
Version Control & Containerization Tools: Git, GitHub, Bitbucket, Jenkins, Docker, Terraform
Software Life Cycle: Agile, Scrum, Waterfall, CI/CD, DevOps, TDD
Licenses & Certifications
• Snowflake SnowPro Core Certification
• Azure Data Engineering
• IBM Business Intelligence (BI) Analyst Professional Certificate
• Power BI
Work Experience
Client – UPS, Atlanta, GA Jan 23 - Present
Snowflake Developer/Admin
Responsibilities
Designed, developed, and maintained Snowflake cloud data warehouse solutions to support advanced analytics,
business intelligence, and machine learning workflows in a scalable and efficient manner.
Managed Snowflake objects, including databases, schemas, tables, views, stages, materialized views, file formats,
and secure data sharing to enable seamless collaboration.
Built and optimized data pipelines using Snowflake Streams, Tasks, Stored Procedures, and Snowpipe, ensuring real-time and batch data ingestion from diverse sources (a representative pattern is sketched after this list).
Implemented advanced query optimization techniques using clustering keys, data partitioning, pruning strategies,
and result caching, improving query performance by 40%.
Integrated Snowflake with Azure Data Lake, Azure Blob Storage, and Azure Synapse Analytics, creating a
seamless flow of structured and semi-structured data for analytics and reporting.
Developed scalable ETL processes with Azure Data Factory (ADF) and Snowflake, incorporating custom Python
scripts for complex transformations and automated workflows.
Configured Azure Key Vault for secure management of Snowflake credentials, API keys, and sensitive configurations
to ensure data security.
Leveraged Snowflake’s multi-cluster warehouse scaling capabilities to support high-concurrency workloads during
peak business hours.
Deployed Snowflake External Functions to integrate Python-based machine learning models for fraud detection, predictive analytics, and customer segmentation (see the external function sketch following this role's tech stack).
Supported the deployment and scoring of machine learning models using Azure ML and Snowflake, enabling real-
time decision-making capabilities.
Designed and implemented dimensional and relational data models (Star Schema, Snowflake Schema) tailored for
OLAP, OLTP, and machine learning use cases.
Automated critical workflows using dbt (Data Build Tool), enabling version-controlled and modular data
transformations in Snowflake.
Implemented role-based access controls (RBAC), row-level security, and data masking to ensure compliance with
GDPR, HIPAA, and organizational security standards.
Utilized Snowflake’s Time Travel and Zero-Copy Cloning features for disaster recovery, auditing, and environment
duplication, reducing downtime by 50%.
Monitored and optimized Snowflake resource usage with resource monitors, enabling cost-efficient operations across multiple environments.
Configured Data Sharing in Snowflake for secure and controlled data distribution across internal teams and external vendors.
Implemented Snowflake Query History and performance analytics for proactive monitoring of warehouse activity and troubleshooting of query bottlenecks.
Created custom dashboards in Power BI, integrated with Snowflake, for real-time monitoring of business KPIs,
operational metrics, and machine learning model outputs.
Automated deployment and provisioning of Snowflake environments using Terraform and integrated CI/CD pipelines
with Azure DevOps for seamless version control and deployment.
Enabled seamless data pipeline orchestration with Azure Functions and Snowflake Tasks, ensuring robust data
engineering workflows.
Conducted regular audits of Snowflake’s data sharing agreements, security configurations, and user roles to maintain
enterprise compliance.
Collaborated with cross-functional teams, including data engineers, data scientists, and business analysts, to gather
requirements and deliver high-impact solutions.
Provided Level 2/Level 3 support for Snowflake environments, including incident resolution, performance tuning, and
capacity planning.
Documented comprehensive operational procedures for Snowflake architectures, including best practices for
performance tuning, cost management, and security compliance.
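A minimal Snowflake SQL sketch of the Streams-and-Tasks ingestion pattern referenced in this list; all object, warehouse, and column names are hypothetical placeholders rather than actual client objects:

-- Hypothetical objects; sketch of a Stream + Task incremental load.
-- Capture row-level changes on a landing table.
CREATE OR REPLACE STREAM raw.shipment_events_stream
  ON TABLE raw.shipment_events;

-- Task that merges new changes every five minutes, but only when the stream has data.
CREATE OR REPLACE TASK analytics.load_shipment_facts
  WAREHOUSE = transform_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('RAW.SHIPMENT_EVENTS_STREAM')
AS
  MERGE INTO analytics.shipment_facts AS tgt
  USING raw.shipment_events_stream AS src
    ON tgt.event_id = src.event_id
  WHEN MATCHED THEN UPDATE SET tgt.status = src.status, tgt.updated_at = src.updated_at
  WHEN NOT MATCHED THEN INSERT (event_id, status, updated_at)
    VALUES (src.event_id, src.status, src.updated_at);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK analytics.load_shipment_facts RESUME;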
Tech Stack:
Python, SQL, Snowflake, Azure Data Factory, Azure Blob Storage, Azure Data Lake, Azure Synapse Analytics, Azure ML,
Snowflake Streams, Snowflake Tasks, Snowflake External Functions, Snowflake Time Travel, Zero-Copy Cloning, Data
Partitioning, Clustering, Query Optimization, Schema Design, dbt (data build tool), Power BI, Azure Key Vault, Role-Based
Access Control (RBAC), Row-Level Security, Data Masking, Data Encryption, Terraform, Git, Azure DevOps, Jira,
Confluence.
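A hedged sketch of how a hosted Python model can be exposed to Snowflake through an external function behind Azure API Management, as described in the list above; the integration name, tenant and application IDs, and endpoint URLs are assumptions for illustration only:

-- Hypothetical integration and endpoint values; illustration only.
CREATE OR REPLACE API INTEGRATION azure_ml_api_int
  API_PROVIDER = azure_api_management
  AZURE_TENANT_ID = '<tenant-id>'
  AZURE_AD_APPLICATION_ID = '<app-id>'
  API_ALLOWED_PREFIXES = ('https://example-apim.azure-api.net/ml')
  ENABLED = TRUE;

-- Expose the hosted model as a SQL-callable function.
CREATE OR REPLACE EXTERNAL FUNCTION ml.predict_fraud(features VARIANT)
  RETURNS VARIANT
  API_INTEGRATION = azure_ml_api_int
  AS 'https://example-apim.azure-api.net/ml/score';

-- Score rows directly in a query.
SELECT txn_id, ml.predict_fraud(OBJECT_CONSTRUCT(*)) AS fraud_score
FROM analytics.transactions
LIMIT 100;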
Client – GGK Technologies, Hyderabad, IN Apr 21 – Jul 22
Snowflake Developer/Admin
Responsibilities
Designed and managed the Snowflake data warehouse to integrate healthcare data from EHR systems, claims, and
external data sources into a centralized repository.
Configured Snowflake Streams, Tasks, and Stored Procedures to automate data ingestion, transformation, and
loading workflows.
Leveraged Snowflake Time Travel and Zero-Copy Cloning for data recovery, auditing, and environment duplication.
Implemented Snowflake External Functions to integrate machine learning models for real-time predictions in
healthcare operations.
Built and optimized ETL pipelines using Azure Data Factory to process and ingest large-scale healthcare data into
Snowflake.
Utilized Azure Blob Storage and Azure Data Lake for scalable storage and seamless integration with Snowflake.
Configured Azure Key Vault to securely manage credentials and API keys for Snowflake and other services.
Collaborated with data scientists to prepare and deliver ML-ready datasets using Snowflake and Azure ML.
Deployed and operationalized ML models via Snowflake External Functions to support predictive analytics for
patient care and operational efficiency.
Employed MLflow for tracking, versioning, and managing machine learning models and experiments.
Designed and implemented dimensional data models (e.g., star schema) in Snowflake to support healthcare analytics
and reporting.
Processed and transformed structured and semi-structured data (e.g., HL7, JSON, FHIR) using Snowflake’s Variant data type and Python (a FLATTEN-based example is sketched after this list).
Developed data marts and pre-aggregated views for self-service reporting and KPI dashboards.
Monitored and optimized Snowflake queries using clustering, partitioning, and query profiling tools.
Implemented resource monitoring and cost optimization strategies in Snowflake to ensure efficient operations.
Applied role-based access control (RBAC), row-level security, and data masking to ensure compliance with HIPAA and other data privacy regulations (a masking-policy sketch follows this role's tech stack).
Conducted encryption and anonymization of sensitive healthcare data to safeguard patient information.
Processed HL7 and FHIR data for seamless integration with EHR systems and downstream analytics.
Built data pipelines to consolidate and analyze claims, patient records, and operational metrics.
Integrated healthcare interoperability standards to enhance data sharing across systems.
Worked in Agile teams, collaborating with data engineers, analysts, and data scientists to deliver incremental features.
Created and maintained detailed documentation for Snowflake architecture, data pipelines, and standard operating
procedures.
Used Terraform to automate the provisioning of cloud infrastructure for Snowflake environments.
Enabled seamless data sharing across stakeholders using Snowflake Data Sharing.
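A minimal sketch, with assumed stage, table, and field names, of loading FHIR JSON into a VARIANT column and flattening it for analytics, as referenced in the list above:

-- Hypothetical stage, table, and FHIR paths; illustration only.
CREATE OR REPLACE TABLE raw.fhir_patient_bundles (doc VARIANT);

-- Bulk-load JSON bundles staged in cloud storage.
COPY INTO raw.fhir_patient_bundles
FROM @raw.fhir_stage/patients/
FILE_FORMAT = (TYPE = 'JSON');

-- Flatten bundle entries into one row per Patient resource.
SELECT
  e.value:resource:id::STRING      AS patient_id,
  e.value:resource:gender::STRING  AS gender,
  e.value:resource:birthDate::DATE AS birth_date
FROM raw.fhir_patient_bundles b,
     LATERAL FLATTEN(input => b.doc:entry) e
WHERE e.value:resource:resourceType::STRING = 'Patient';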
Tech Stack:
Azure Data Factory, Python, Snowflake, Azure Synapse Analytics, Azure Blob Storage, Azure Data Lake, dbt, Power BI,
Azure Key Vault, Azure Monitor, Azure Machine Learning, Snowflake Streams, Snowflake Tasks, Snowflake External
Functions, Snowflake Time Travel, Zero-Copy Cloning, Role-Based Access Control (RBAC), Row-Level Security, Data
Masking, HL7, FHIR, MLflow, Kafka, Terraform, SQL.
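A hedged sketch of the RBAC and dynamic data masking approach described for this role; the roles, schemas, and columns are placeholders rather than the client's actual security model:

-- Hypothetical roles and columns; sketch of RBAC plus dynamic data masking.
CREATE ROLE IF NOT EXISTS healthcare_analyst;
GRANT USAGE ON DATABASE clinical TO ROLE healthcare_analyst;
GRANT USAGE ON SCHEMA clinical.curated TO ROLE healthcare_analyst;
GRANT SELECT ON ALL TABLES IN SCHEMA clinical.curated TO ROLE healthcare_analyst;

-- Mask patient identifiers for all but privileged compliance roles.
CREATE OR REPLACE MASKING POLICY clinical.curated.mask_ssn AS
  (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('COMPLIANCE_ADMIN') THEN val
    ELSE 'XXX-XX-' || RIGHT(val, 4)
  END;

ALTER TABLE clinical.curated.patients
  MODIFY COLUMN ssn SET MASKING POLICY clinical.curated.mask_ssn;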
Client – Deccan Info Systems, Hyderabad, IN Feb 20 – Mar 21
Snowflake Developer/Admin
Responsibilities
Designed and implemented a scalable Snowflake data warehouse to handle high-volume financial transactions,
integrating data from various internal banking systems and third-party APIs.
Configured Snowflake Streams and Tasks to automate data ingestion, enabling real-time updates for financial
transactions and customer data.
Managed and optimized Snowflake schemas, roles, and permissions to ensure efficient data access control and security.
Leveraged Snowflake Time Travel and Zero-Copy Cloning to support data recovery, auditing, and testing in different environments (see the sketch at the end of this role's entry).
Utilized AWS S3 for scalable, secure data storage and integration with Snowflake for seamless data processing and
querying.
Employed AWS Glue to automate ETL processes, transforming raw financial data into structured formats for
Snowflake ingestion.
Integrated AWS Redshift with Snowflake for hybrid cloud data processing and reporting.
Developed and deployed machine learning models using AWS SageMaker and Snowflake External Functions to
predict customer churn, assess credit risk, and detect fraudulent transactions.
Leveraged MLflow for model tracking, versioning, and managing the machine learning lifecycle.
Integrated Python and Scikit-learn for feature engineering, model training, and evaluation, operationalizing predictions
directly within Snowflake.
Designed and implemented dimensional models (star schema) in Snowflake, optimizing them for financial reporting
and analytics, including customer segmentation, transaction categorization, and fraud detection.
Used dbt (data build tool) for data transformation and ensuring high-quality, consistent data pipelines.
Created data marts and pre-aggregated views in Snowflake to support business intelligence and self-service reporting
for financial analysts and decision-makers.
Optimized Snowflake queries to improve performance, reducing reporting latency by up to 30%.
Used Snowflake clustering and partitioning strategies to ensure efficient data retrieval from large financial datasets.
Implemented resource monitoring and cost optimization strategies to ensure efficient use of Snowflake compute resources (clustering and resource monitor configuration are sketched after this list).
Applied Role-Based Access Control (RBAC), Row-Level Security (RLS), and Data Masking in Snowflake to
protect sensitive customer data and ensure compliance with financial regulations (e.g., SOX, GDPR).
Managed data encryption and implemented data anonymization techniques to safeguard financial transactions and
customer information.
Automated infrastructure provisioning and configuration using Terraform to ensure consistency and scalability of
cloud resources.
Utilized AWS IAM for managing access controls and ensuring proper authorization across financial data workflows.
Developed CI/CD pipelines using GitHub Actions and Jenkins to streamline the deployment of data workflows and
machine learning models.
Collaborated with data scientists, financial analysts, and compliance teams to design and implement data-driven
solutions, such as fraud detection and credit risk assessment.
Participated in Agile Scrum processes, including sprint planning, daily stand-ups, and retrospectives, to ensure timely
delivery of data engineering solutions.
Created and optimized Power BI dashboards to monitor key financial metrics, such as customer churn rate, fraud
detection alerts, and credit scoring trends.
Provided data insights to business stakeholders by enabling self-service reporting and data exploration using
Snowflake-integrated Power BI reports.
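An illustrative Snowflake SQL sketch, with assumed table, warehouse, and quota values, of the clustering and resource monitoring configuration referenced in this list:

-- Hypothetical names and quotas; illustration only.
-- Cluster a large transaction table on the columns most queries filter by.
ALTER TABLE finance.transactions
  CLUSTER BY (transaction_date, account_id);

-- Check how well the table is clustered for those keys.
SELECT SYSTEM$CLUSTERING_INFORMATION('finance.transactions', '(transaction_date, account_id)');

-- Cap monthly credit consumption and suspend the warehouse at the limit.
CREATE OR REPLACE RESOURCE MONITOR reporting_rm
  WITH CREDIT_QUOTA = 100
       FREQUENCY = MONTHLY
       START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 80 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE reporting_wh SET RESOURCE_MONITOR = reporting_rm;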
Tech Stack:
Snowflake, SQL, Python, AWS S3, AWS Glue, AWS Redshift, AWS SageMaker, TensorFlow, Scikit-learn, MLflow, Power
BI, Snowflake Streams, Snowflake Tasks, Snowflake External Functions, Snowflake Time Travel, Zero-Copy Cloning, Role-
Based Access Control (RBAC), Row-Level Security (RLS), dbt, Terraform, GitHub Actions, Jenkins, AWS IAM, Snowflake
Clustering, Data Masking, Data Encryption, Snowflake Data Sharing, Snowflake Query Optimization, Agile methodologies
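A brief sketch of the Time Travel and Zero-Copy Cloning pattern referenced in this role; database and table names are placeholders:

-- Hypothetical names; sketch of Time Travel and Zero-Copy Cloning.
-- Query a table as it existed one hour ago (within the retention period).
SELECT *
FROM finance.transactions AT (OFFSET => -3600);

-- Recover a table to a prior point in time without a backup restore.
CREATE OR REPLACE TABLE finance.transactions_recovered
  CLONE finance.transactions AT (OFFSET => -3600);

-- Stand up a full test environment instantly, without duplicating storage.
CREATE DATABASE banking_test CLONE banking_prod;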