Vikram Raju Nadampalli (Green Card Holder)
Senior Data Architect/Modeler
[Link]@[Link]/ (848) 256-0477
Summary
Accomplished Senior Data Architect/Modeler with 10+ years of extensive experience in designing, implementing, and optimizing
data solutions on Azure.
Proficient in Azure Data Lake, SQL Database, Databricks, and PowerShell scripting.
Strong background in data architecture, ETL processes, and data warehousing.
Exceptional problem-solving skills with a track record of delivering scalable and efficient data solutions.
Experience working with data modeling tools such as Erwin, SAP PowerDesigner, and ER/Studio.
Experience in designing Conceptual, Logical, and Physical data models to build the Data Warehouse.
Experience in creating models for Oracle, Teradata, SQL Server, and DB2.
Excellent understanding of MDM approaches, including creating a data dictionary and using Informatica or other tools to map
source systems to the target MDM data model.
Experience in Agile Methodologies and SCRUM Process.
Experience in JIRA for planning, tracking, reporting, and release management.
Experience in developing data models that serve both OLTP and OLAP functionality as per business needs.
Experience in the Big Data Hadoop ecosystem for ingestion, storage, querying, processing, and analysis of big data.
Sound knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification, and identifying data mismatches.
Efficient with Normalization (1NF, 2NF, and 3NF) and De-normalization techniques for improved database performance.
Extensive experience with Excel and pivot tables to run and analyze result data sets, as well as UNIX scripting.
Expertise in Python-based environments, including analytics, data wrangling, and Excel data extracts.
Experience in rendering and delivering reports in desired formats using reporting tools such as Tableau.
Expertise in SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS).
Experience with Tableau in analysis and creation of dashboard and user stories.
Well versed in creating process flow diagrams and entity-relationship diagrams during Analysis and Design using Visio.
Excellent problem-solving and analytical skills with an exceptional ability to learn and master new technologies efficiently.
Technical Skills
Cloud Services: Azure and AWS.
Data Modeling Tools: ER/Studio, Erwin R2Sp2, SAP PowerDesigner.
Big Data: Hadoop 3.3, Hive 2.3, Sqoop 1.4, Pig 0.17.
Databases: SQL Server, DB2, Teradata
Development Methodologies: Agile, Scrum, Waterfall
OLAP & ETL Tools: Tableau, SSIS, Talend, Informatica Power Center
Reporting Tools: MS Excel, Tableau, Power BI, SSRS
Programming languages: SQL, Python, PowerShell.
Work Experience
State of PA - Mechanicsburg, PA April 2018 - Present
Senior Data Architect/Modeler
Responsibilities:
Architected and implemented data solutions using Azure Data Lake, Azure SQL Database, and Azure Databricks to support advanced
analytics and data warehousing needs.
Led agile development teams to deliver high-quality data systems within short time frames.
Participated in design discussions and ensured functional specifications were delivered in all phases of the SDLC in an Agile environment.
Designed and developed ETL processes using Azure Data Factory and Databricks to ensure seamless data integration and transformation.
Optimized data storage and retrieval by implementing best practices for Azure Data Lake and SQL Database, improving query
performance and reducing costs.
Utilized ER/Studio for comprehensive data modeling tasks, including conceptual, logical, and physical models.
Developed data ingestion pipelines using Azure Data Factory to ingest data into Azure Data Lake.
Designed and implemented SQL databases for storing structured data in Azure.
Automated Azure resource provisioning and management tasks using PowerShell scripts.
Developed ETL pipelines using Databricks to transform and load data into target systems.
Developed stored procedures, triggers, and functions to automate data processing tasks in SQL databases.
Optimized SQL queries for performance and efficiency in handling large datasets.
Implemented database security measures such as role-based access control and encryption in SQL databases.
Selected, evaluated, and implemented architectural procedures and techniques using ER/Studio to complete complex data modeling
tasks efficiently.
Created and maintained conceptual, logical, and physical data models to represent data requirements and relationships.
Conducted database schema design reviews to ensure proper data modeling and normalization.
Conducted Databricks cluster management and optimization for resource utilization.
Optimized data models and database schemas for performance, scalability, and efficiency.
Worked on database migration projects to move on-premises databases to Azure SQL Database.
Led significant architectural projects, leveraging ER/Studio to ensure data models were robust and aligned with strategic objectives.
Collaborated with cross-functional teams to troubleshoot and resolve database performance issues.
Captured, stored and processed both structured and unstructured data using SQL.
Used Azure cloud services such as Databricks, Data Lake, and Azure Synapse Analytics.
Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
Managed database performance monitoring and tuning activities to improve system responsiveness.
Provided technical guidance and support to developers on using SQL databases for application development.
Worked on data lake migration projects to move data from on-premises storage to Azure Data Lake.
Designed and developed user-defined functions, stored procedures, and triggers for Cosmos DB.
Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management (MDM) process.
Developed comprehensive documentation using ER/Studio to support data models, including technical specifications, design documents,
and data dictionaries.
Created Logical and Physical Data Models using Erwin.
Provided training and documentation for data analysts on using Azure Data Lake for analytics and reporting.
Designed and implemented REST APIs to access the SQL database platform.
Analyzed business and technical requirements to develop comprehensive data reporting strategies.
Used Power BI to create dashboards and participated in the process of choosing the right tool for Dashboards and Analytics.
Tools: Agile, Azure, Erwin R2Sp2, SQL, Data Lake, Databricks, PowerShell, Power BI, REST API.
DST Systems - Kansas City, MO June 2016 - March 2018
Data Architect
Responsibilities:
Worked with Data Architect to implement data model changes in database in all environments.
Collaborated with stakeholders to gather requirements and design data solutions that align with business goals and objectives.
Conducted Design discussions and meetings to come out with the appropriate Data Model.
Participated in gathering and defining of business requirements to support the team across all projects.
Worked with Data Stewards and Business analysts to gather requirements for MDM Project.
Extensively used the Agile method with daily scrums to discuss project-related information.
Orchestrated data movement between different storage systems, including Azure Blob Storage, Azure SQL Database, and Azure Data
Lake Storage, utilizing Azure Data Factory.
Managed deployments and migrations of services to MS Azure.
Worked on data integrity and query execution efficiency, applying knowledge of Azure Databricks.
Created Logical/Physical Data Models in ERwin and worked on loading the tables into the Data Warehouse.
Performed data analysis on source data to be transferred into an MDM structure.
Designed and implemented a fully operational, production-grade, large-scale data solution on the Snowflake Data Warehouse.
Used Azure reporting services to upload and download reports.
Participated in performance management and tuning for stored procedures, tables and database servers.
Applied Data Governance rules for primary qualifiers, class words, and valid abbreviations in table and column names.
Configured and managed data integration from on-premises databases to Azure cloud environments through Azure Data Factory.
Used Azure Data Factory’s mapping data flows to perform complex data transformations and enrichments.
Developed a Python utility to validate HDFS tables against source tables.
Worked with Azure Blob and Data Lake storage and loaded data into Azure Synapse Analytics (DW).
Created the logical data model from the conceptual model and converted it into the physical database design using ERwin.
Used the ETL tool Informatica to populate the database and transform data from the old database to the new database using
Oracle.
Integrated ER/Studio with other data management tools and systems to ensure seamless data operations and workflows.
Set up and managed Hadoop clusters to process and analyze large-scale datasets.
Wrote complex SQL scripts to analyze data in different data warehouses such as Snowflake.
Created complex program units using PL/SQL records and collection types.
Created and maintained metadata, including table and column definitions.
Worked on MapReduce and query optimization for Hadoop Hive and HBase architectures.
Worked extensively on importing and exporting data into HDFS using Sqoop.
Analyzed and optimized business processes within Dynamics AX to improve efficiency and performance.
Designed and developed a Data Lake using Hadoop for processing raw and processed claims via Hive and Informatica.
Worked with DBA to create the physical model and tables.
Created designs and process flows on how to standardize Power BI dashboards to meet the business requirement.
Created and modified various Stored Procedures used in the application using T-SQL.
Tools: Erwin, Oracle, PL/SQL, Agile, Azure, Snowflake, Power BI, HBase, MDM, Python, T-SQL.
BCBS - Kansas City, KS Oct 2015 - May 2016
Data Modeler/Architect
Responsibilities:
Documented all data mapping and transformation processes in the Functional Design documents based on the business requirements.
Generated DDL (Data Definition Language) scripts using Erwin and assisted the DBA in physical implementation of data models.
Created conceptual, logical, and physical models for the Data Warehouse (Data Vault) and Data Mart.
Used JIRA as an agile tool to keep track of the stories that were worked on using the agile methodology.
Implemented a proof of concept deploying this product in Amazon Web Services (AWS).
Created FastExport, MultiLoad, and FastLoad scripts for batch processing.
Worked on cloud technologies including Amazon EC2 and S3, supporting both the development and production environments.
Worked on migrating the EDW to AWS using EMR and various other technologies.
Responsible for full data loads from production to AWS Redshift staging environment.
Worked on both OLTP and OLAP databases, creating performance benchmarks and generating SQL*Plus reports.
Implemented AWS Lambda functions on S3.
Documented Informatica mappings in Excel spreadsheets.
Created data masking mappings to mask the sensitive data between production and test environment.
Designed the Data Marts in dimensional Data modeling using star and snowflake schemas.
Developed ETL data mapping and loading logic for MDM loading from internal and external sources.
Defined and deployed monitoring, metrics, and logging systems on AWS.
Extensively used Star and Snowflake Schema methodologies.
Automated data modeling tasks and processes within ER/Studio to increase efficiency and reduce manual effort.
Involved in database development by creating Oracle PL/SQL Functions, Procedures and Collections.
Used the AWS Glue Data Catalog with crawlers to pull data from S3 and perform SQL query operations.
Gathered requirements, analyzed and wrote the design documents.
Configured security roles and permissions within Dynamics AX to ensure data protection and compliance.
Created custom reports and dashboards in Dynamics AX to provide actionable insights to stakeholders.
Used Git for code reviews during branching, merging, and staging.
Designed data process flows using Informatica to source data into Statements database on Oracle platform.
Prepared dashboards using calculations and parameters in Tableau.
Planned and defined system requirements to Use Case Scenario and Use Case Narrative using the UML methodologies.
Tools: Erwin, Oracle PL/SQL, Agile, AWS, JIRA, OLAP, OLTP, Git, Informatica, Tableau, MDM, MS Excel.
Penton Media- Overland Park, KS Feb 2014 - Sep 2015
Data Analyst/Data Modeler
Responsibilities:
As a Data Analyst/Data Modeler, I was responsible for all data-related aspects of the project.
Involved in the data analysis and database modeling of both OLTP and OLAP environment.
Provided data sourcing methodology, resource management and performance monitoring for data acquisition.
Extensively worked on ETL mappings, analysis, and documentation of OLAP report requirements.
Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable.
Gathered various reporting requirements from the business analysts.
Created Physical Data Model from the Logical Data Model using Compare and Merge Utility in ER/Studio.
Developed Snowflake Schemas by normalizing the dimension tables as appropriate.
Developed MDM metadata dictionary and naming convention across enterprise.
Used the Teradata FastLoad utility to load large volumes of data into empty Teradata tables in the Data Mart for the initial load.
Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
Worked on data validation to ensure the accuracy of the data between the warehouses.
Utilized forward/reverse engineering tools and target database schema conversion process.
Worked on Key Performance Indicators (KPIs) and the design of star schema and snowflake schema in Analysis Services (SSAS).
Performed data reconciliation between integrated systems.
Created Entity Relationship Diagrams (ERD), Functional diagrams, Data flow diagrams and enforced referential integrity constraints.
Generated DDL scripts for database modification, including Teradata macros, views, and SET tables.
Created complex SQL queries using views, indexes, triggers, roles, stored procedures, and user-defined functions, and worked with
different methods of logging in SSIS.
Involved in writing stored procedures and packages using PL/SQL.
Designed and developed dynamic, advanced T-SQL, including stored procedures, XML, user-defined functions, parameters, views, tables,
triggers, and indexes.
Tools: SQL, SSIS, SSAS, OLAP, OLTP, ER/Studio, Teradata, MDM, T-SQL, PL/SQL.
Education: B Tech (CSE), JNTU Kakinada, 2012