SUBRAMANYAM.K
Mobile: +91-9742339551 Email: [email protected] and [email protected]

Professional Summary
• Over 8 years of experience in the IT industry covering development, testing and maintenance, with a focus on Data Warehousing projects using ETL tools such as Talend 7.x/6.x and Informatica 9.x/8.x.
• Close to 4 years of experience with the Talend 7.x suite for DI, and knowledge of Talend Studio for BD.
• Performed data extraction, transformation and loading using Talend, and designed data conversions across a wide variety of source and target systems including Oracle, Snowflake DB, Hive and non-relational sources such as flat files.
• Extensively created jobs in Talend using tMap, tJoin, tAggregateRow, tFilterRow, tNormalize, tDenormalize, tFixedFlowInput, tJavaRow, tJavaFlex, tSnowflakeRow, tOracleSCD, tFileInputDelimited and tFileOutputDelimited for transformations, and connection components such as tFTPConnection and tOracleConnection.
• Experienced in creating triggers on the TAC server to schedule Talend jobs.
• Strong experience in extracting, transforming and loading (ETL) data from various sources into data warehouses using Informatica PowerCenter 9 (Repository Manager, Designer, Workflow Manager, Workflow Monitor).
• Developed SCD Type-II as well as incremental load logic according to business requirements (a SQL sketch of both patterns follows this list).
• Good understanding of Data Warehousing concepts.
• Knowledge of Python scripting, used within Talend jobs.
• Strong SQL skills.
• Hands-on experience with Jira, Rally and HP QC (Quality Center), GitHub for version control, and ServiceNow for ticketing and change-request management.
• Excellent understanding of the overall Software Development Life Cycle (SDLC), with exposure to the activities of each phase.
• Excellent communication and analytical skills, and the ability to perform as part of a team.
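
For illustration, the SCD Type-II and incremental-load patterns referenced above can be sketched in plain SQL. This is a minimal sketch under assumed names, not code from any of the projects below: dim_customer, stg_customer, src_orders, etl_control and all column names are hypothetical.

    -- SCD Type-II, step 1: close out the current version of any changed row.
    -- (For brevity this ignores NULL-safe comparisons.)
    UPDATE dim_customer d
       SET d.eff_to = SYSDATE, d.is_current = 'N'
     WHERE d.is_current = 'Y'
       AND EXISTS (SELECT 1 FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.name <> d.name OR s.city <> d.city));

    -- SCD Type-II, step 2: insert a fresh current version for changed and new keys.
    INSERT INTO dim_customer (customer_id, name, city, eff_from, eff_to, is_current)
    SELECT s.customer_id, s.name, s.city, SYSDATE, NULL, 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.is_current = 'Y'
                          AND d.name = s.name AND d.city = s.city);

    -- Incremental load: extract only rows changed since the last successful run,
    -- using a watermark kept in a control table.
    SELECT o.*
      FROM src_orders o
     WHERE o.updated_at > (SELECT last_run_ts FROM etl_control
                            WHERE job_name = 'orders_load');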

Work Experience
• Working as a Systems Analyst for UST Global Technologies from Jan’2020 till date.
• Worked as Consultant for Capgemini, Bangalore from Mar’2018 to Dec’2019.
• Worked as Sr. Software Engineer for Emids Technologies, Bangalore from Nov’2017 to Mar’2018.
• Worked as System Analyst for Atos India Pvt.Ltd., Bangalore from Jan’2015 to Nov’2017.
• Worked as Associate Consultant for HCL Technologies Ltd, Hyderabad from Sep’2011 to Jan’2013.

Qualification
• M.C.A (Master of Computer Applications) from SV University, Tirupathi.
• B.C.A (Bachelor of Computer Applications) from SV University, Tirupathi.

Technology
ETL Tools : Talend 7.x/6.x and Informatica 9.x/8.x.
Databases : Oracle, PostgreSQL and Snowflake.
Languages : SQL, Python and Java.
Operating Systems : Windows 10 and UNIX.
Client Tools : Toad, SQL Developer, DBeaver, PuTTY, WinSCP and FileZilla.
Scheduling Tools : TAC, Windows Task Scheduler and TWS.

Projects
Project Name   Environment                         Client Name                Period
EFXINTL        Talend 7.1.1, Oracle 11g            Equifax, USA               Jul’2020 to till date
AMI            Talend 7.1.1 and Snowflake          Nissan Digital India LLP   Jan’2020 to Jun’2020
Ops-Metrics    Talend, Greenplum 5.12              GE Power, USA              Mar’2018 to Dec’2019
ADSE           Talend, Oracle, Salesforce & ADLS   IQVIA, India               Nov’2017 to Mar’2018
Web-ADSL       Informatica & Oracle                OneWeb, USA                Jan’2015 to Oct’2017
CritSit-CRM    Informatica & Oracle                Microsoft, USA             Sep’2011 to Jan’2013

PROJECT # 6

Project Name EFXINTL migration Client Equifax, USA

Technologies Talend 7.1.1, Oracle, GitHub and Bitbucket.


Duration Jul’2020 to Present Team Size 2 Role ETL Developer
Project Description Currently, the EFXINTL monthly revenue loads are done using an IBM ETL tool called DataStage. The ETL solution needs to be migrated to Talend; after reviewing market studies such as the Gartner Magic Quadrant for ETL tools, together with in-house experience, the Finance BI team decided to choose Talend.

Besides the migration itself, which guarantees the continuity of the monthly financial close, the project aims to improve the process, speeding up data availability for reporting and capturing data such as Product, Customer, new Product identifications and Vertical information.
Responsibilities • Understood the business requirements from the client in order to create Talend jobs.
• Extracted data from homogeneous and heterogeneous sources such as flat files and relational databases, loading into the Data Warehouse using Talend jobs.
• Performed data manipulations using Talend DI components such as tMap, tJoin, tReplicate, tConvertType, tFlowToIterate, tAggregateRow, tFilterRow, tRowGenerator, tJava, tJavaRow, tJavaFlex, tSqlRow, tNormalize, tDenormalize, tSetGlobalVar, tDie, tLogCatcher, tOracleSCD, etc.
• Used flat-file and database components such as tFileInputDelimited, tFileOutputDelimited, tFileList, tFixedFlowInput, tFileInputFullRow, tFileInputRaw, tOracleInput, tOracleOutput, tGreenplumInput and tGreenplumOutput.
• Used Talend Big Data components such as tHDFSInput, tHDFSOutput, tHiveInput, etc.
• Developed jobs with custom SCD Type-II logic to keep track of historical data in Talend.
• Developed incremental loading logic to load data into the DWH in Talend.
• Implemented loading of multiple files with different schemas into multiple targets in a single flow.
• Implemented a single source generating multiple targets dynamically.
• Used context parameters in Talend jobs, such as context variables and globalMap variables.
• Created jobs that pass parameters from child to parent and from parent to child jobs.
• Generated sequence numbers to combine rows and eliminate duplicates in Talend (see the SQL sketch after this list).
• Worked on improving the performance of Talend jobs.
• Combined multiple subjobs/jobs to execute within a single job.
• Used tRunJob and tParallelize components to run jobs sequentially as well as in parallel.
• Worked on TAC to create tasks and deploy Talend jobs.
• Created triggers to run Talend jobs automatically or on a schedule on the server.
• Used Error Recovery in Talend Studio and Recovery Management in TAC.
• Created execution plans on TAC (Talend Administration Centre).
• Worked with Joblets to create Talend jobs.
• Ran Python scripts from Talend.
• Uploaded functions to the GitHub repository for peer review.
• Attended Scrum calls as per the project's Agile methodology.
• Used HP QC as the defect-tracking tool and Jira for work allocation and time logging.
• Fixed bugs during UAT and provided production bug fixes after deployment (hypercare).
• Provided post-go-live support.
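
As a concrete illustration of the sequence-number deduplication mentioned above, the same idea can be expressed in SQL with a window function. A minimal sketch under assumed names; src_accounts and its columns are hypothetical, and this is not the actual Talend job.

    -- Number the rows within each business key, newest first, and keep only
    -- row 1, which eliminates the duplicates.
    SELECT account_id, account_name, updated_at
      FROM (SELECT account_id, account_name, updated_at,
                   ROW_NUMBER() OVER (PARTITION BY account_id
                                      ORDER BY updated_at DESC) AS rn
              FROM src_accounts) ranked
     WHERE ranked.rn = 1;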

PROJECT # 5

Project Name AMI Client Nissan Digital India.

Technologies Talend 7.1.1 and Snowflake (snowflakecomputing.com)


Duration Jan’2020 to Jun’2020 Team Size 6 Role ETL Developer
Project Description AMI stands for Africa, Middle East and India. The project extracts data from SAP as well as from files and loads it into the Snowflake cloud using Talend, covering multiple country entities such as NMIPL, RNIPL, NOAS, NKSA, NMEF, NSA and NMEG.

Responsibilities • Understood the business requirements from the client in order to create Talend jobs.
• Extracted data from sources such as flat files and SAP, loading into the EDW on AWS (Snowflake) using Talend.
• Implemented Talend jobs for DI, such as SCD and email-notification jobs.
• Scheduled jobs using TAC.
• Monitored the daily, weekly and ad hoc runs that load data into the target.
• Fixed bugs during UAT and provided production bug fixes after deployment.
• Provided post-go-live support.

PROJECT # 4

Project Name Ops-Metrics Client GE Power, USA

Technologies Talend 6.2, Greenplum 5.12 and Oracle 11g


Duration Mar’2018 to Dec’2019 Team Size 8 Role ETL Developer
Project Description GE (General Electric) is an American multinational conglomerate with businesses in multiple sectors; GE Power, headquartered in Atlanta, is one of them. GE Power is currently split into Power Conversion and Power Services: Gas Power, Steam Power and Grid Solutions come under Power Conversion, while Power Engineering and PMA Engineering come under Power Services. The Ops-Metrics project falls under the Power Engineering segment, where visualization dashboards are maintained. These visualization dashboards (reports) hold critical data that internally supports the Power Engineering segment: tender details, contractor details, ledger and PO details, and the annual budget with an insight into the expected budget and expenses for the coming 3 years; in addition, GE Power employee information and various shop orders and their analysis are maintained in parallel.

Responsibilities • Understood the business requirements from the client in order to create Talend jobs.
• Extracted data from homogeneous and heterogeneous sources such as flat files and relational databases, loading into the Data Warehouse using Talend and Informatica.
• Extracted data from sources such as flat files, Salesforce and XML, loading into the AWS data lake (Greenplum) using Talend jobs.
• Implemented Talend jobs for DI and was involved in POCs using Talend for Big Data.
• Monitored the daily, weekly and ad hoc runs that load data into the target.
• Involved in performance tuning by analysing bottlenecks at the mapping level.
• Fixed bugs during UAT and provided production bug fixes after deployment.
• Provided post-go-live support.

PROJECT # 3

Project Name ADSE Client Quintiles (IQVIA), Bangalore

Technologies Talend 6.2 and Oracle 11g


Duration Nov’2017 to Mar’2018 Team Size 7 Role ETL Developer
Project Description IQVIA, formerly Quintiles IMS Holdings, Inc., is an American multinational company serving the combined industries of health information technology and clinical research, and a provider of biopharmaceutical development and commercial outsourcing services. RC sends messages (source XML) to ActiveMQ; messages consumed from ActiveMQ are stored in a table structure in the FODS tables. In general terms, these systems (SSR, NSQIP, TRAUMA and RDC) collect patient-level data on procedures and outcomes and provide feedback to the participants; for these systems, IRP develops a product for ACS by integrating the data from SSR, NSQIP, TRAUMA and RDC into the ODS.

Responsibilities • Understood the business requirements from the client in order to create Talend jobs.
• Extracted data from sources such as flat files, Salesforce and XML, loading into ADLS using Talend jobs.
• Developed a job to remove duplicate source records using tMap.
• Designed a job transforming one record into multiple records (data pivoting).
• Created a series of sequence numbers rather than using tMap.

PROJECT # 2

Project Name Web-ADSL Client OneWeb, USA

Technologies Informatica 9.1 & Oracle 10g


Duration Jan’2015 to Oct’2017 Team Size 7 Role ETL Developer
Project Description OneWeb is formerly known as Worldview Satellites. The project extracts data from multiple sources such as Salesforce, ScienceLogic and Remedy (DB) and loads it into Azure Data Lake Store (ADLS), with Databricks as the processing engine, through the Informatica ETL tool.

Input data: input data comes from 3 different sources:

• Vlocity (Salesforce): read CSV data files with comma separators from the Vlocity source.

• ScienceLogic: read XML files from the ScienceLogic source (through a REST API).

• Remedy: read RDBMS tables from the Remedy source (database).

Output data: all of the above sources are loaded into ADLS through Talend jobs.

Responsibilities • Understood the business requirements from the client in order to create mappings.
• Extracted data from homogeneous and heterogeneous sources such as flat files and relational databases, loading into the Data Warehouse using Informatica.
• Developed jobs with custom SCD Type-II logic to keep track of historical data.
• Developed incremental loading logic to load data into the DWH in Informatica.
• Good knowledge of Star schema and Snowflake schema.
• Well acquainted with UNIX commands.
• Involved in a PoC and implemented Talend jobs for DI.
• Monitored the daily, weekly and ad hoc runs that load data into the target.
• Involved in performance tuning by analysing bottlenecks at the mapping level.
• Fixed bugs during UAT and provided production bug fixes after deployment.
• Provided post-go-live support.

PROJECT # 1

Project Name CritSit – CRM Client Microsoft, USA.

Technologies Informatica 8.6 & Oracle 10g


Duration Sep’2011 to Jan’2013 Team Size 9 Role ETL Developer
Project Description CritSit is a web application that keeps track of the high-priority issues opened for customers holding a Premier contract. CritSit pulls almost all of its data from different systems, integrating with systems such as MS Solve, CAP and SIE.

Responsibilities • Understood the requirements and functionality of the application.
• Developed a thorough understanding of the source systems.
• Worked with PowerCenter Designer client tools such as Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
• Implemented performance tuning to increase the performance of the loading process.
• Supported UAT and the production go-live.

Declaration: I hereby declare that all the details furnished above are true to the best of my knowledge.

Place : Bangalore

Date : (K.SUBRAMANYAM)
