Shanna Lovelace

Data Administrator

440 followers
Portland, Maine Metropolitan Area

  • About me

    Data Engineer | Analytics Engineer | ETL/ELT | SQL | Python | dbt (data build tool) | Databricks | Snowflake | GitHub

  • Education

    • University of Southern Maine

      2010 - 2014
      Bachelor's degree, Biology, General

      Activities and Societies: Golden Key International Honour Society

    • University of Alaska Fairbanks

      2006 - 2009
      Associate of Science (A.S.), Information Technology

    • Casper College

      2001 - 2004
      Associate of Arts (A.A.), History
  • Experience

    • New England Cancer Specialists

      Jun 2017 - Oct 2019
      Data Administrator

      - Provided support and maintenance for bidirectional EMR data interfaces, managing connections among multiple practice management systems.
      - Performed detailed data analysis and facilitated timely submissions to government and private institutions, keeping data reporting accurate, compliant, and on schedule.
      - Managed data administration tasks, including troubleshooting, development of new workflows, and creation of detailed process documentation.
      - Collaborated closely with the HIPAA compliance officer and engaged an external security firm to ensure strict adherence to data security requirements, safeguarding sensitive information and mitigating risk.

    • MaineHealth

      Nov 2019 - Aug 2021
      Data Warehouse Analyst

      - Oversaw internal and external data sources and destinations, including EMR systems, API integrations, and cloud-based databases, ensuring seamless data flow and accessibility across systems.
      - Diagnosed and resolved complex issues across multiple data warehouses, using advanced troubleshooting techniques to keep operations running and maintain data integrity.
      - Developed robust ETL SSIS packages using SQL and Python for diverse stakeholder needs, including internal reporting, marketing initiatives, financial data analysis, and regulatory submissions.
      - Applied CI/CD practices to explore new data processing methods, improving functionality and performance while staying abreast of industry advancements.
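The ETL packages mentioned above can be sketched in plain Python as an extract, transform, load sequence. This is an illustrative sketch only, not MaineHealth code: the table names (raw_visits, clean_visits) and the in-memory SQLite database standing in for the warehouse are assumptions.

```python
# Hypothetical ETL step: pull raw rows, normalize them, load the result.
import sqlite3

def extract(conn):
    """Pull raw rows from a stand-in source table."""
    return conn.execute("SELECT id, name, visit_date FROM raw_visits").fetchall()

def transform(rows):
    """Normalize names and drop rows missing a visit date."""
    return [(rid, name.strip().title(), date)
            for rid, name, date in rows if date is not None]

def load(conn, rows):
    """Bulk-insert cleaned rows into the destination table."""
    conn.executemany("INSERT INTO clean_visits VALUES (?, ?, ?)", rows)
    conn.commit()

# Demo with an in-memory database standing in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_visits (id INTEGER, name TEXT, visit_date TEXT)")
conn.execute("CREATE TABLE clean_visits (id INTEGER, name TEXT, visit_date TEXT)")
conn.executemany("INSERT INTO raw_visits VALUES (?, ?, ?)",
                 [(1, "  ada lovelace ", "2021-01-05"), (2, "bob", None)])
load(conn, transform(extract(conn)))
print(conn.execute("SELECT * FROM clean_visits").fetchall())
# → [(1, 'Ada Lovelace', '2021-01-05')]
```

In an SSIS package the same three stages would be a source component, a transformation, and a destination, but the shape of the work is the same.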

    • OM1, Inc.

      Aug 2021 - May 2023
      Clinical Data Analytics Engineer

      - Developed end-to-end analytics solutions integrating structured and unstructured data from diverse sources, including files via AWS S3, EMR interfaces, HL7, FHIR, APIs, and internally developed applications.
      - Designed, built, and validated end-to-end ETL/ELT pipelines using Docker and dbt for reliable data flow from source to destination.
      - Built Tableau visualizations to strengthen quality control and project communication.
      - Crafted data models to meet current requirements while allowing for future scalability, optimal organization, and evolving needs.
      - Led development of the MS-LINK nationwide registry by integrating data from EMR systems and internally developed portals, overcoming a staff shortage by leveraging expertise in SQL, Python, dbt, Snowflake, Databricks, and Scala.
      - Used SQL, Python, and Scala in Snowflake and Databricks to create tailored datasets for machine learning initiatives such as the Dartmouth Next Generation Learning Health System for Multiple Sclerosis, improving patient outcomes, financial planning, and predictive insights.
      - Developed bespoke solutions to load, transform, and integrate billions of claims into OM1's data store, overcoming data size limitations and optimizing the Snowflake environment while running the MS-LINK project in parallel.
      - Collaborated with cross-functional stakeholders, including the Chief Financial Officer, legal counsel, and ML engineers, to scope and prioritize business requirements.
      - Evaluated and approved vendors on cost, integration, scalability, and quality, rejecting services with limited scalability, hidden fees, or poor customer support.
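Loading billions of claims past per-statement size limits typically comes down to batching. A minimal sketch of that idea, with `chunked`, the batch size, and the stand-in `load_batch` all hypothetical rather than OM1's actual implementation:

```python
# Split a very large record stream into fixed-size chunks before loading,
# a common workaround for statement/file size limits in bulk loads.
from itertools import islice

def chunked(iterable, size):
    """Yield successive lists of at most `size` items."""
    it = iter(iterable)
    while batch := list(islice(it, size)):
        yield batch

loaded_batches = []

def load_batch(rows):
    # Stand-in for a Snowflake bulk-insert / COPY INTO call.
    loaded_batches.append(rows)

claims = range(10)  # stand-in for billions of claim records
for batch in chunked(claims, 4):
    load_batch(batch)

print([len(b) for b in loaded_batches])  # → [4, 4, 2]
```

Because `chunked` consumes an iterator lazily, the full dataset never needs to fit in memory at once.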

    • Shift5

      Sep 2023 - Present
      Analytics Engineer

      - Developing and maintaining custom data models from the data pipeline for multiple protocols and platforms, ensuring the scalability and extensibility of data modeling and pipeline code.
      - Developing and performing data integrity and quality testing at each step of data loading and transformation, and owning troubleshooting and fixes for loading and transformation issues.
      - Collaborating and communicating with cross-functional teams, including Hardware, Product, Platform, Field Engineering, and Data Engineering, to ensure data solutions further Shift5's mission of smarter and safer fleets.
      - Developed bespoke AWS-to-Snowflake/Databricks medallion data modeling that loads and links protocol data for different platforms from raw datasets, with the ability to scale to more protocols and platforms.
      - Updating code and creating new solutions to improve AWS-to-data-platform loading and transformations for efficiency and reliability, including macros for repeated code and automatic updates based on translation data from upstream data teams.
      - Oversaw the data pipeline and warehouse migration from Snowflake to Databricks, including updating dbt models, changing pipeline configurations, and reloading data.
      - Maintaining and updating code based on stakeholder requests and pipeline changes, including new columns and pipeline nomenclature updates in quarterly releases.
      - Continuously testing data quality, integrity, and pipeline congruity with custom SQL and built-in dbt tests, with plans to expand testing using Python and Databricks notebooks.
      - Creating and maintaining pipeline and code documentation using dbt data dictionaries, Python data dictionary code, Confluence pages, code comments, and documents for internal and external stakeholders.
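The data integrity checks described above map onto dbt's built-in column tests (`not_null`, `unique`). A hedged sketch of the same checks in plain Python, with the record shape and column names (`id`, `protocol`) purely illustrative:

```python
# Column-level quality checks in the spirit of dbt's not_null and unique
# tests, expressed over a list of dicts standing in for a table.
def check_not_null(rows, column):
    """Return rows where `column` is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

records = [
    {"id": 1, "protocol": "A"},
    {"id": 2, "protocol": None},
    {"id": 2, "protocol": "B"},
]
print(check_not_null(records, "protocol"))  # → [{'id': 2, 'protocol': None}]
print(check_unique(records, "id"))          # → [2]
```

As in dbt, a passing check returns an empty list of failures, which makes these easy to wire into an assertion step after each load or transformation.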

  • Licenses & Certifications

  • Volunteer Experience

    • Volunteer Peer Guide and Animal Caregiver

      Animal Refuge League of Greater Portland
      Feb 2016