Ravi Kumar R

OBIEE Developer

2,000 followers
Duluth, Georgia, United States

  • About me

    Senior ETL Developer & Snowflake Lead at Corp-Corp Opportunities

  • Education

    • National Texas University

      2002 - 2005
      Computer Science
  • Experience

    • Verizon Data Services

      Jan 2007 - Jul 2009
      OBIEE Developer

      ● Designed the data architecture of the pharmacy systems data warehouse.
      ● Handled multiple projects at a time and successfully met several intersecting deadlines to yield business deliverables.
      ● Conducted sessions with the business to negotiate requirements, bridging the gap between business needs and the data warehousing system.
      ● Performed gap analysis to meet the project requirements.
      ● Thoroughly analyzed the data map layout sent from the source system before transforming it into a logical structure.
      ● Designed Conceptual, Logical and Physical data models using Erwin Data Modeler and MS Visio.
      ● Identified and compiled the various tables related to the opportunity and retail systems and created distinct subject areas.
      ● Specified entities, attributes and relationships in the Erwin data model after analyzing the database system.
      ● Implemented slowly changing dimensions of Type 2 and Type 3 to retain the history of reference data changes (see the sketch after this list).
      ● Created aggregate and rollup tables from dimension tables to meet reporting requirements.
      ● Created views and materialized views to be used in reporting.
      ● Used Reverse Engineering and Forward Engineering on databases from DDL scripts and created tables and models in the data mart, data warehouse and staging areas.
      ● Used Ralph Kimball methodologies to perform dimensional modeling on the OLAP system.
      ● Developed and maintained a data dictionary to create metadata reports for technical and business purposes.
      ● Employed the Naming Standard Editor to define new naming standards for entities, attributes, domains, columns and tables, applied consistently across the enterprise.
      ● Created standard documents for generating DDLs from Erwin Data Modeler via Forward Engineering and Model Merge into the master data model.
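      A minimal sketch of the Type 2 slowly changing dimension pattern mentioned above, in Oracle-style SQL; all table, column and sequence names here are hypothetical, not taken from the project:

      -- Hypothetical SCD Type 2 load: expire the current dimension row whose
      -- tracked attribute changed, then insert the new version.
      UPDATE dim_customer d
         SET d.effective_end_date = TRUNC(SYSDATE),
             d.is_current         = 'N'
       WHERE d.is_current = 'Y'
         AND EXISTS (SELECT 1
                       FROM stg_customer s
                      WHERE s.customer_id = d.customer_id
                        AND s.address    <> d.address);   -- tracked attribute changed

      INSERT INTO dim_customer
             (customer_key, customer_id, address,
              effective_start_date, effective_end_date, is_current)
      SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address,
             TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
        FROM stg_customer s
        JOIN dim_customer d
          ON d.customer_id        = s.customer_id
         AND d.is_current         = 'N'
         AND d.effective_end_date = TRUNC(SYSDATE)        -- rows expired in the step above
       WHERE s.address <> d.address;

      A Type 3 design would instead keep a prior_address column on the same row rather than inserting a new row per change.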

    • One Link Mortgage & Financial

      Aug 2009 - Aug 2010
      OBIEE Developer

      ● Understood the process flow and business functionality of the EUS Input data warehouse, including the History Model and the Expiring Model used by EUS to calculate the scores and the required variables.
      ● Analyzed the Conceptual and Physical data models for EUSPA, covering the five source systems, EUS Staging and the PA EUS Input data warehouse (target).
      ● Attended meetings with the end clients to decide the LOBs in scope for each EUSPA component, and to understand the Functional Specifications and the relationships between Accounts, Policies, Claims and other attributes.
      ● Created Detail Design documents containing the technical specifications for the given functionality, the overall process flow for each process, flow diagrams, mapping spreadsheets, issues, assumptions, configurations, Informatica code details, shell scripts, etc., and conducted meetings with the clients for approval of the process.
      ● Developed Logical and Physical dimensional data models in Erwin that capture current-state and future-state data elements and data flows consisting of families of stars.
      ● Responsible for creating the Data Abstraction model and for creating XML messages from it.
      ● Used Erwin and Power Designer to simplify and visualize tables, views and relationships instantly in a contextual diagram without any intermediate step.
      ● Extensively used Erwin and Power Designer to design logical, physical and domain models, represented visually in diagrams using Information Engineering (IE) notation.
      ● Extensively implemented and governed the corporate naming standards according to the company-specified standards.
      ● Discovered potential relationships between disparate data sources, and compared and synchronized the structure of data sources and targets.

    • Indiana State Department of Health (ISDH)

      Sep 2010 - Apr 2013
      ETL Developer

      Responsibilities:
      ● Gathered business requirements by organizing and managing meetings with business stakeholders, application architects, technical architects and IT analysts on a scheduled basis.
      ● Used DataStage Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into the data warehouse.
      ● Performed requirement analysis and gathering to provide technical and architectural support to the team.
      ● Documented the purpose of each mapping so that personnel could understand the process and incorporate changes as needed.
      ● Developed various Server and Parallel jobs using Oracle, ODBC, FTP, Peek, Aggregator, Filter, Funnel, Copy, Hash File, Change Capture, Merge, Lookup, Join and Sort stages.
      ● Developed PL/SQL procedures, functions, packages, triggers, and normal and materialized views (see the sketch after this list).
      ● Worked with the reporting team on extensive data mart reporting for slice & dice, drill-down and drill-through.
      ● Handled defect tracking, unit testing, defect reporting, analysis of results and documentation.
      ● Designed DataStage parallel jobs to extract data from various source systems, transform and convert the data, load it into the data warehouse, and send data from the warehouse to third-party systems such as the mainframe.
      ● Designed jobs using complex stages such as Transformer, Change Capture, Change Apply, Remove Duplicates, Join, Lookup, Merge, Funnel and Aggregator.
      ● Performed ETL performance tuning to increase the speed of the ETL process.
      ● Addressed production and UAT issues, taking appropriate action based on priority and requirement.
      ● Worked with other team members to implement DataStage best practices for better performance.
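      A minimal sketch of a reporting materialized view of the kind described above, in Oracle SQL; the table and column names (fact_claims, claim_amount, etc.) are hypothetical:

      -- Hypothetical materialized view pre-aggregating claims by county and month
      -- so that drill-down reports do not scan the fact table.
      CREATE MATERIALIZED VIEW mv_monthly_claims
        BUILD IMMEDIATE
        REFRESH COMPLETE ON DEMAND
      AS
      SELECT county_code,
             TRUNC(claim_date, 'MM') AS claim_month,
             COUNT(*)                AS claim_count,
             SUM(claim_amount)       AS total_amount
        FROM fact_claims
       GROUP BY county_code, TRUNC(claim_date, 'MM');

      -- Typically refreshed after the nightly ETL load completes:
      -- EXEC DBMS_MVIEW.REFRESH('MV_MONTHLY_CLAIMS', method => 'C');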

    • Department of Motor Vehicles (DMV)

      May 2013 - Jul 2016
      ETL Developer

      ● Leveraged AWS services such as EC2, Auto Scaling and VPC to build secure, highly scalable and flexible systems that handled expected and unexpected load bursts.
      ● Created and managed IAM user accounts and role-based policies for access to AWS services.
      ● Implemented and maintained monitors, alarms and notifications for EC2 instances using CloudWatch and SNS.
      ● Performed S3 bucket creation and access policy setup, and archived outdated data to Glacier through lifecycle policy configuration.
      ● Actively participated in decision-making and QA meetings, and regularly interacted with the business analysts and development team to gain a better understanding of the business process, requirements and design.
      ● Used Pentaho Kettle as the ETL tool to extract data from source systems and load it into Oracle and MariaDB databases.
      ● Created Kettle jobs using steps such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator and Row Generator.
      ● Extensively used parallel stages such as Row Generator, Column Generator, Head and Peek for development and debugging.
      ● Used both Spoon and Carte to run the solution, test and debug its components, and monitor the resulting executable versions on an ad hoc or scheduled basis.
      ● Developed complex stored procedures using input/output parameters, cursors, views and triggers, and complex queries using temporary tables and joins.
      ● Documented ETL test plans, test cases, test scripts and validations based on design specifications for unit, system and functional testing; prepared test data for testing, error handling and analysis.
      ● Developed a test plan covering the scope of the release, entrance and exit criteria and the overall test strategy; created detailed test cases and test sets and executed them manually.

    • Flagship Credit Acceptance

      Sep 2016 - Apr 2018
      Lead Informatica Developer

      Responsibilities:
      ● In-depth knowledge of the Software Development Lifecycle (SDLC), with an understanding of phases such as requirements, analysis, development, testing, implementation, documentation and maintenance.
      ● Extensive experience in data mapping, business process and workflow design and development, trading partner setup, integration, configuration, testing and validation, partner deployment and production support.
      ● Owned the design, development and maintenance of data integration mappings and their artifacts.
      ● Extensive experience in business analysis, translating EDI requirements into Functional Specifications.

    • MIO Partners, Inc.

      Apr 2018 - Feb 2019
      Lead Informatica DWH

      MIO Partners, Inc. (MIO) provides investment options and advice to McKinsey & Company's (McKinsey) pension plans, and to current and former McKinsey partners. We hire third-party managers to invest the majority of our assets under management. We also invest directly in macro asset classes, and provide the retirement plan with access to passively managed assets. Additionally, we provide wealth management advice to current and former McKinsey partners.

      Responsibilities:
      ● Integrated data into the database from different sources by creating SSIS packages.
      ● Integrated new vendors, such as Pluralsight, into the One HR / One LnD SSAS model for reporting features.
      ● Redesigned existing models and created new models based on requirements to enable faster, more efficient reporting.
      ● Created Power BI reports by location and SBU per business requirements.
      ● Performed encryption and decryption of source files to maintain data confidentiality.
      ● Created tables and stored procedures in different layers of the database based on business requirements (see the sketch after this list).
      ● Involved in creating the SSAS model.
      ● Created new columns and measures over the data models using DAX to enhance Power BI visuals.
      ● Created rich reports using visuals such as Table, Line chart, Bar chart, Single card, Map and Pie chart.
      ● Implemented drill-down and drill-through functionality.
      ● Applied visual-level, page-level and report-level filters to restrict specific values.
      ● Created bookmarks for navigating from one visual or page to another.
      ● Created an app with the required reports and dashboards and shared it with different users and security groups.
      ● Published reports to the Power BI service.
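      A minimal sketch of a layered-load stored procedure of the kind described above, in T-SQL; the schema, table and column names (stg, rpt, LearningActivity, etc.) are hypothetical:

      -- Hypothetical procedure moving staged vendor data into the reporting layer.
      -- In practice this would be invoked from an SSIS Execute SQL Task after the load.
      CREATE PROCEDURE rpt.usp_Load_LearningActivity
      AS
      BEGIN
          SET NOCOUNT ON;

          MERGE rpt.LearningActivity AS tgt
          USING stg.LearningActivity AS src
              ON tgt.EmployeeId = src.EmployeeId
             AND tgt.CourseId   = src.CourseId
          WHEN MATCHED THEN
              UPDATE SET tgt.Vendor      = src.Vendor,
                         tgt.CompletedOn = src.CompletedOn
          WHEN NOT MATCHED BY TARGET THEN
              INSERT (EmployeeId, CourseId, Vendor, CompletedOn)
              VALUES (src.EmployeeId, src.CourseId, src.Vendor, src.CompletedOn);
      END;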

    • Fidelity Investments

      Apr 2018 - Present
      ETL / Snowflake Lead

      Responsibilities:
      ● Integrated data into the database from different sources by creating SSIS packages.
      ● Integrated new vendors, such as Pluralsight, into the One HR / One LnD SSAS model for reporting features.
      ● Redesigned existing models and created new models based on requirements to enable faster, more efficient reporting.
      ● Created Power BI reports by location and SBU per business requirements.
      ● Performed encryption and decryption of source files to maintain data confidentiality.
      ● Created tables and stored procedures in different layers of the database based on business requirements.
      ● Involved in creating the SSAS model.
      ● Created new columns and measures over the data models using DAX to enhance Power BI visuals.
      ● Created rich reports using visuals such as Table, Line chart, Bar chart, Single card, Map and Pie chart.
      ● Implemented drill-down and drill-through functionality.
      ● Applied visual-level, page-level and report-level filters to restrict specific values.
      ● Created bookmarks for navigating from one visual or page to another.
      ● Created an app with the required reports and dashboards and shared it with different users and security groups.
      ● Published reports to the Power BI service and provided access to the required users.
      ● Created SQL Server Agent jobs to automate and schedule SSIS package execution.
      ● Participated in support activities such as job monitoring and failure resolution, and resolved data-related issues in reports.
      ● Created internal and external Snowflake stages and transformed data during load (see the sketch after this list).
      ● Redesigned views in Snowflake to increase performance.
      ● Experience working with AWS, Azure and Google data services.
      ● Created ETL mappings to populate the dimension and fact tables.

      Environment: Snowflake, Redshift, SQL Server, AWS, Azure, Talend, Jenkins and SQL
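      A minimal sketch of loading through a Snowflake external stage with a transform applied during COPY; the stage, bucket, table and column names are hypothetical, and the storage integration / credentials clause is omitted:

      -- Hypothetical external stage over S3 (a STORAGE_INTEGRATION or CREDENTIALS
      -- clause would normally be required and is omitted here).
      CREATE OR REPLACE STAGE hr_ext_stage
        URL = 's3://example-bucket/hr/'
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

      -- Transform during load: COPY INTO accepts a SELECT over the staged file columns.
      COPY INTO hr_db.staging.employee_events (employee_id, event_type, event_date)
      FROM (SELECT $1,
                   UPPER($2),                      -- normalize the event type during load
                   TO_DATE($3, 'YYYY-MM-DD')
              FROM @hr_ext_stage)
      ON_ERROR = 'CONTINUE';

      Internal stages follow the same COPY pattern, with files pushed to the stage via PUT instead of being read from S3.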

  • Licenses & Certifications