Anil Kolla

Business Intelligence Developer

2,000 followers
Dallas, Texas, United States

  • About me

    Technical Lead @ Microsoft | Certified AZ-400, DP-203, DP-200 | Databricks, ADF, Data Lake, DevOps, YAML, Terraform, AKS (Kubernetes), Docker, Synapse, Power BI, Power Platform (Power Automate and Power Apps).

  • Education

    • Jawaharlal Nehru Technological University

      2005 - 2009
      Bachelor of Technology - BTech, Electrical, Electronics and Communications Engineering; Grade: A
    • Middlesex University

      2009 - 2010
      Master's degree, Telecommunications Engineering; Grade: B

      Activities and Societies: Chess, Cricket and Volleyball

  • Experience

    • WDPF

      Dec 2010 - Mar 2014
      Business Intelligence Developer

      Began my career with T-SQL, SSIS, SSAS, and SSRS. Developed ETL logic using MSBI and Informatica tools and built reports using SSRS.

    • Visionet Systems Inc. (Bank of America)

      Apr 2014 - Oct 2015
      Senior Software Engineer

      I was the sole MSBI (SSIS, SSAS, SSRS) lead developer: I gathered the business requirements, created the technical design from them, and developed the ETL process in parallel.

    • Bosch

      Nov 2015 - Oct 2020
      Senior BI Developer / Lead

      • Developed U-SQL scripts with aggregation columns and produced monthly and daily usage summaries.
      • Generated Cosmos streams per requirements, e.g. adding new columns or changing join conditions.
      • Worked extensively with the Azure Data Lake layer and generated U-SQL scripts.
      • Implemented a PySpark framework using Azure Databricks.
      • Used PySpark for data processing, aggregation, and cleansing, reading files from ADLS.
      • Created Linked Services, Datasets, and Pipelines in Azure Data Factory and scheduled the pipelines.
      • Ran and monitored Cosmos and ADL jobs from their job windows.
      • Downloaded data from Cosmos to SQL Server using FastCopy.exe.
      • Worked on the SSAS Tabular Model and modified the existing cube structure.
      • Modified existing Power BI reports.
      • Performance tuning: created partitions and applied data compression to high-volume tables.
      • Helped architect data movement between the layers and proposed solutions.
      • Part of the Production/DEV-UAT support team maintaining 3,000+ servers, keeping the applications running on them online 24x7 and troubleshooting application and SQL Server issues.
      • Led the migration from the old datacenter to the new datacenter from offshore.
      • Handled an incident-reduction project.
      • On-call support along with core SQL DBA activities.
      • Handled changes to critical e-commerce applications.
      • Database performance tuning.

      Environment: Azure Data Factory, Azure Data Lake Analytics, Azure Databricks, U-SQL, Azure SQL DWH, SQL DBA
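      The monthly and daily usage summaries described above can be sketched in plain Python (a stand-in for the PySpark groupBy/agg logic; the record layout here is an assumed example, not the actual schema):

      ```python
      from collections import defaultdict
      from datetime import date

      def summarize_usage(records):
          """Aggregate usage records into daily and monthly totals.

          `records` is an iterable of (day, user, usage) tuples -- an assumed
          layout; the real job read files from ADLS and aggregated in PySpark.
          """
          daily = defaultdict(float)
          monthly = defaultdict(float)
          for day, user, usage in records:
              daily[(day, user)] += usage                      # per-day rollup
              monthly[((day.year, day.month), user)] += usage  # per-month rollup
          return dict(daily), dict(monthly)

      rows = [
          (date(2019, 1, 1), "app-a", 10.0),
          (date(2019, 1, 1), "app-a", 5.0),
          (date(2019, 1, 2), "app-a", 2.5),
      ]
      daily, monthly = summarize_usage(rows)
      # daily[(date(2019, 1, 1), "app-a")] -> 15.0; monthly[((2019, 1), "app-a")] -> 17.5
      ```

      In PySpark the same shape falls out of `df.groupBy("day", "user").agg(...)` followed by a month-level rollup.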

    • Microsoft

      Nov 2016 - Dec 2019

      • Led a team of 10; created user stories and estimated timelines.
      • Looked ahead to identify potential obstacles and found ways to work around them.
      • Guided team members to help them succeed and advance as developers.
      • Participated in review meetings to update stakeholders.
      • Analyzed business requirements and implemented them in the Microsoft BI stack.
      • Provided technical/functional support to the team.
      • Designed and implemented ETL for different banking/mortgage data flows.
      • Studied the business analysts' manual work and designed efficient solutions so they could view reports more easily.
      • Fetched data from REST APIs and loaded it into Azure SQL DB.
      • Created Databricks notebooks using PySpark (Python) to process files from the Bronze layer to the Silver and Gold layers.
      • Made data available to all resources using high-availability techniques.
      • Performed initial analysis, assigned tasks to team members, and worked with the team to deliver within timelines.
      • Worked extensively with the Azure Data Lake layer and wrote Azure Databricks scripts in PySpark (Python).
      • Created Linked Services, Datasets, and Pipelines in Azure Data Factory and scheduled the pipelines.
      • Modified existing SSRS reports.
      • Performance tuning: created partitions and applied data compression to high-volume tables.
      • Helped architect data movement between the layers and proposed solutions.
      • Part of the Production/DEV-UAT support team maintaining 3,000+ servers, keeping the applications running on them online 24x7 and troubleshooting application and SQL Server issues.
      • On-call support along with core SQL DBA activities.
      • Handled changes to critical Ford plant applications.
      • Database performance tuning.

      Environment: Azure Data Factory, Azure Data Lake Analytics, Azure Databricks, PySpark (Python), Azure SQL DWH

      • Led the project with 30 resources, estimated timelines, and designed the solution.
      • Evaluated deliverables and provided suggestions for improving the solutions.
      • Led the project for a diamond client whose application predicts plant downtime, thereby increasing efficiency.
      • Provided technical/functional support to the team.
      • Participated in review meetings to update stakeholders.
      • Designed and implemented ETL for streaming data flows.
      • Created Azure Data Factory pipelines to load data from JSON files (in Blob) and an on-premises SQL Server database into Azure Blob and SQL.
      • Created Linked Services, Datasets, and Pipelines in Azure Data Factory and scheduled the pipelines.
      • Created Power BI dashboards with tachometers, slicers, and bookmarks with buttons over 1 TB of data from Azure SQL Server, and published them to the Power BI service.
      • Created Power BI dashboards reporting the plant's uptime and downtime predictions.
      • Part of the Production/DEV-UAT support team maintaining 3,000+ servers, keeping the applications running on them online 24x7 and troubleshooting application and SQL Server issues.
      • Used Apache Airflow to author, schedule, and monitor workflows.
      • Created Airflow pipelines to process real-time data, pulling data off streams in batches.
      • Led the migration from the old datacenter to the new datacenter from offshore.
      • Database performance tuning.
      • On-call support along with core SQL DBA activities.

      Environment: Azure Data Factory, Azure SQL DWH, Event Hubs, Power BI, Data Science

      • As team lead, understood the requirements and worked with modelers to gather information on each model.
      • Performed initial analysis, assigned tasks to team members, and worked with the team to deliver within timelines.
      • Created a generic pipeline to load data for each model, regardless of requirement, via individual Azure Databricks PySpark notebooks.
      • Developed PySpark scripts with aggregation columns and produced monthly and daily usage summaries by pivoting data.
      • Worked extensively with the Azure Data Lake layer and wrote Azure Databricks scripts.
      • Fetched data from REST APIs and loaded it into Azure SQL DB.
      • Created Linked Services, Datasets, and Pipelines in Azure Data Factory and scheduled the pipelines.
      • Ran and monitored Cosmos and ADL jobs from their job windows.
      • Downloaded data from Cosmos to SQL Server using FastCopy.exe.
      • Worked on the SSAS Tabular Model and modified the existing cube structure.
      • Modified existing Power BI reports.
      • Performance tuning: created partitions and applied data compression to high-volume tables.

      Environment: Azure Data Factory, Azure Data Lake Analytics, Azure Databricks, PySpark, Azure SQL DWH

      • Senior Azure Data Engineer / Lead : DMJ

        Apr 2019 - Dec 2019
      • Lead Data Engineer : Debswana

        Jan 2018 - Mar 2019
      • Senior Azure Data Engineer : TXWIC

        Nov 2016 - Dec 2018
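      The "fetch from REST API and load to Azure SQL DB" pattern that recurs in these roles can be sketched as a paged pull-and-load loop. This is a minimal sketch with the HTTP client and the database loader injected as plain functions (so it stays runnable); the real pipelines used an HTTP client against the actual API and Azure SQL DB:

      ```python
      def load_api_pages(fetch_page, insert_rows, page_size=100):
          """Pull all pages from a paged endpoint and hand each batch to a loader.

          `fetch_page(offset, limit)` returns a list of rows (empty when done);
          `insert_rows(rows)` writes a batch to the target store. Both are
          hypothetical stand-ins for the real HTTP and database calls.
          """
          offset, total = 0, 0
          while True:
              rows = fetch_page(offset, page_size)
              if not rows:          # empty page signals the end of the feed
                  break
              insert_rows(rows)     # batch insert keeps round-trips bounded
              total += len(rows)
              offset += page_size
          return total

      # Usage with stub data standing in for the API and the database:
      data = [{"id": i} for i in range(250)]
      loaded = []
      count = load_api_pages(lambda off, lim: data[off:off + lim], loaded.extend)
      # count -> 250, loaded -> all 250 rows in order
      ```

      Batching by offset/limit is one common paging scheme; APIs that use continuation tokens need the loop condition adapted accordingly.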
    • Deloitte

      Jan 2020 - Oct 2020
      Data Engineering Consultant / Lead

      • Created packages in SSIS Designer using Control Flow tasks and Data Flow transformations to implement business rules.
      • Transformed data from various data sources over OLE DB connections across several SSIS packages.
      • Created mappings using Data Flow transformations such as Sort, Derived Column, Conditional Split, SCD, Pivot, and Lookup.
      • Used Control Flow elements such as Containers, Execute SQL Task, Execute Package Task, File System Task, and Send Mail Task.
      • Debugged and validated the data flows and control flows.
      • Developed an exception-handling process for each SSIS package.
      • Designed and developed database scripts and procedures.
      • Implemented checkpoint configuration and package configuration in packages for better reusability.
      • Created pipelines in Azure Data Factory and integrated SSIS packages into the Azure environment.
      • Implemented batch orchestration for multiple SSIS packages using ADF pipelines.
      • Backed up and restored databases; monitored and troubleshot jobs.
      • Monitored the SQL Server error logs.
      • Developed administrative processes and procedures to monitor database systems.
      • Ensured all production servers stayed up to date with patches and hotfixes.
      • Assigned user permissions (granting, denying, and revoking).
      • Troubleshot database connectivity problems, database job failures, and backup failures.
      • Database backups and restoration: backup strategies, scheduled backups, and backing up the master and system databases and restoring them when necessary.
      • Made data available to all resources using high-availability techniques.
      • Troubleshot database activity to maintain accuracy and integrity.
      • Used Data Transformation Services to import and export data to and from other applications used by the organization.
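      The checkpoint configuration mentioned above is SSIS's restart-from-failure mechanism: completed tasks are recorded in a checkpoint file so a re-run resumes after the last successful step. A plain-Python sketch of that idea (illustrative only; SSIS manages its checkpoint file internally):

      ```python
      import json
      import os
      import tempfile

      def run_with_checkpoint(tasks, checkpoint_path):
          """Run named tasks in order, skipping any already recorded as done.

          Mimics SSIS checkpoint-file restart: after each successful task its
          name is persisted, so a re-run after a failure resumes where it left
          off instead of repeating completed work.
          """
          done = set()
          if os.path.exists(checkpoint_path):
              with open(checkpoint_path) as f:
                  done = set(json.load(f))
          for name, fn in tasks:
              if name in done:
                  continue              # already completed on a previous run
              fn()
              done.add(name)
              with open(checkpoint_path, "w") as f:
                  json.dump(sorted(done), f)   # persist progress after each task

      log = []
      path = os.path.join(tempfile.mkdtemp(), "ckpt.json")
      tasks = [("extract", lambda: log.append("extract")),
               ("load", lambda: log.append("load"))]
      run_with_checkpoint(tasks, path)   # first run executes both tasks
      run_with_checkpoint(tasks, path)   # second run skips both (checkpointed)
      # log -> ["extract", "load"]
      ```

      SSIS additionally scopes this per container and honors `FailPackageOnFailure`; the sketch shows only the core resume-on-restart behavior.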

    • Providence

      Nov 2020 - Feb 2023
      Technical Lead Data Engineer / TPM

      • Gathered requirements and produced technical designs for Finance, building a new BI solution for each project using Data Factory, Databricks, and Power BI reports.
      • Built SharePoint lists/documents to serve as a mid-layer between source and destination when fetching headcount data.
      • Developed, deployed, and managed one solution across different Finance modules.
      • Developed, published, and managed Power Apps (a nominations web page) and Power Automate flows (live triggers when an event is created or modified).
      • Data governance and data quality analysis.
      • Designed and architected the workflow to read data from the Finance system into the Azure application.
      • Implemented data orchestrations to migrate data from on-premises to the cloud using an Azure Synapse workspace.
      • Identified and enabled the required Azure resources and set up environments with the infra team.
      • Analyzed business requirements and implemented them in the Microsoft BI stack for the supply chain, HR, Finance, and clinical domains.
      • Coordinated with source teams on data mapping and prepared the mapping documents.
      • Provided solutions to read data from Lawson, Workday, Project Online, and the Oracle data store; set up the environment (Azure Data Lake Store, Azure Synapse workspace, Snowflake) and Logic Apps for automation.
      • Fetched data from REST APIs and loaded it into Azure SQL DB and Snowflake.
      • Created skeletons for data processing in the Apache Spark pool in Azure Synapse and built reusable code to clone or mount ADLS into Synapse.
      • Created Synapse pipelines to fetch data from REST APIs and load it into the data lake.
      • Created Synapse SQL scripts on serverless SQL to read and process data from data lake files (Parquet).
      • Created Databricks notebooks using PySpark (Python) to process files from the Bronze layer to the Silver and Gold layers.
      • Communicated with business analysts to gather business requirements and understand business users' expectations.
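      The Bronze-to-Silver processing step above is, conceptually, a cleanse-and-conform pass over raw records. A plain-Python stand-in for that notebook logic (the field names here are assumed examples; the real step ran as PySpark over Parquet files in the lake):

      ```python
      def bronze_to_silver(rows):
          """Cleanse raw ('bronze') records into a conformed 'silver' set.

          Drops rows missing the key, trims string fields, and de-duplicates
          on `id` -- a typical first cleansing pass in a medallion pipeline.
          """
          seen, silver = set(), []
          for row in rows:
              rid = row.get("id")
              if rid is None or rid in seen:
                  continue              # reject keyless rows and duplicates
              seen.add(rid)
              silver.append({k: v.strip() if isinstance(v, str) else v
                             for k, v in row.items()})   # trim string fields
          return silver

      raw = [{"id": 1, "name": " Alice "},
             {"id": 1, "name": "Alice"},     # duplicate key, dropped
             {"id": None, "name": "ghost"},  # missing key, dropped
             {"id": 2, "name": "Bob"}]
      clean = bronze_to_silver(raw)
      # clean -> [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
      ```

      The Gold layer then typically aggregates the silver set into report-ready shapes, which is where the Power BI models read from.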

    • Microsoft

      Apr 2023 - now
      Lead Data Engineer

      • Designed the workflow to read data from the CRM system and load it into the data lake using Data Factory.
      • Identified and enabled the required Azure resources and set up environments with the infra team.
      • Coordinated with source teams on data mapping and prepared the mapping documents.
      • Set up the environment for Azure Data Lake Store, Databricks, Azure Synapse workspace, and Power BI.
      • Fetched data from the data lake using a Databricks workspace (T-SQL commands) and transformed the data from one layer to the next.
      • Mounted the data lake to Databricks using a service principal and Key Vault.
      • Created the required Active Directory groups and provided members the access they need in production and non-production environments.
      • Communicated with business analysts to gather business requirements and business users' expectations.
      • Created star-schema models, then created and loaded tables in Databricks using Databricks notebooks (T-SQL commands).
      • Modeled aggregated-table star schemas suitable for reporting in Power BI.
      • Modeled and created complex dashboards, bookmark reports, and paginated reports in Power BI.
      • Implemented data orchestrations to migrate data from Databricks (data lake) to the Azure Synapse workspace using Data Factory.
      • Created user stories and related tasks on Azure DevOps boards.
      • Built CI/CD pipelines from DEV to QA and PROD in Azure DevOps.
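      Loading a star-schema fact table, as described above, hinges on resolving each incoming business key to a dimension's surrogate key. A plain-Python sketch of that lookup (column names are assumed examples; the real load ran in Databricks notebooks):

      ```python
      def load_fact(sales, dim_customer):
          """Resolve business keys to surrogate keys when loading a fact table.

          `dim_customer` maps natural key -> surrogate key, the standard
          star-schema lookup; unknown keys map to the conventional
          'unknown member' dimension row (-1) rather than being dropped.
          """
          return [{"customer_sk": dim_customer.get(s["customer_id"], -1),
                   "amount": s["amount"]}
                  for s in sales]

      dim = {"C001": 1, "C002": 2}   # natural key -> surrogate key
      facts = load_fact([{"customer_id": "C001", "amount": 50.0},
                         {"customer_id": "C999", "amount": 10.0}], dim)
      # facts[0]["customer_sk"] -> 1; facts[1]["customer_sk"] -> -1 (unknown member)
      ```

      Routing unmatched keys to a -1 "unknown member" row keeps fact counts intact and makes late-arriving dimension rows easy to backfill.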

  • Licenses & Certifications

    • DP-200 Implementing an Azure Data Solution

      Microsoft
      Jul 2020
    • Certified TigerGraph Associate Exam

      TigerGraph
      Mar 2023
    • Analyzing and Visualizing Data with Microsoft Power BI

      Microsoft
      Nov 2019
    • Exam 778: Analyzing and Visualizing Data with Power BI

      Microsoft
      Nov 2019
    • AI-100 : Designing and Implementing an Azure AI Solution

      Microsoft
      Jun 2019
    • Microsoft Certified: Azure Administrator Associate

      Microsoft
      Dec 2024
    • Academy Accreditation - Generative AI Fundamentals

      Databricks
      Nov 2025
    • Microsoft Certified: Azure Data Engineer Associate

      Microsoft
      Jan 2023
    • Microsoft Certified: DevOps Engineer Expert

      Microsoft
      Dec 2024