    Improving Underwater Mine Detection with MLOps

    Introduction

    Good afternoon, everyone! My name is Georgia, and on behalf of Kosol Technology, I would like to welcome you to our webinar with Domino Data Lab, Fiddler AI, and Weights & Biases on applying AI to government solutions. Today's session, the third in the series, focuses on improving underwater mine detection using MLOps.

    Introduction of Speakers

    First, I would love to introduce our speakers for today's presentation:

    • Christopher Elons: Director of Public Sector Go-to-Market at Domino Data Lab
    • Josh Rubin: Principal Scientist at Fiddler AI
    • Kevin Stofen: Solutions Engineer at Weights & Biases

    With that, I'll turn it over to our speakers of the day: Chris, Josh, and Kevin. The floor is yours!

    Overview of Domino Data Lab

    Thanks, Georgia! I appreciate everyone’s attendance here today. I am Christopher Elons, the Director of Public Sector Go-to-Market at Domino Data Lab. I've been with the organization for about four years, leading our initiatives in both civilian and intelligence sectors.

    To kick things off, I would like to provide a quick overview of the Domino Data Lab platform's capabilities. Throughout today's presentation, you will learn about an integrated solution developed by the three vendors presenting today, along with two others not present, to create a modern MLOps pipeline for the Navy. This pipeline enables the deployment of models on Unmanned Underwater Vehicles (UUVs) at the edge, as well as the maintenance and redeployment of those models over time.

    Capabilities of Domino Data Lab

    Domino is an AI/MLOps platform designed to streamline the entire model lifecycle; a brief illustrative sketch of these stages follows the list below. The lifecycle includes:

    • Data Access: Empowering data scientists and ML developers to access necessary data.
    • Model Training: Allowing for the training and testing of models.
    • Deployment: Facilitating the deployment of models to various systems including API endpoints, edge devices, etc.
    • Monitoring and Remediation: Enabling the constant monitoring of these models and allowing for their remediation and redeployment.
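
    The sketch below is a generic, illustrative walk through these stages in plain Python with scikit-learn and Flask. It is not Domino's API; the synthetic data, file names, and endpoint are stand-ins. Domino's role is orchestrating, governing, and scaling exactly this kind of workflow.

    ```python
    # Illustrative only: a generic model lifecycle (data access, training,
    # deployment behind an endpoint, monitoring hook). Not Domino's API.
    import joblib
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from flask import Flask, request, jsonify

    # 1. Data access: synthetic stand-in for a binary "mine / not mine" dataset.
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(500, 8))
    y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)

    # 2. Model training and persistence.
    model = LogisticRegression().fit(X_train, y_train)
    joblib.dump(model, "detector.joblib")

    # 3. Deployment: expose the trained model behind a simple prediction endpoint.
    app = Flask(__name__)

    @app.route("/predict", methods=["POST"])
    def predict():
        features = np.array(request.json["features"]).reshape(1, -1)
        score = float(model.predict_proba(features)[0, 1])
        # 4. Monitoring hook: in production, each prediction would also be logged
        #    to an observability tool for drift and performance tracking.
        return jsonify({"mine_probability": score})

    if __name__ == "__main__":
        app.run(port=8080)
    ```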

    Some of the ways in which we differentiate ourselves include:

    1. Open Architecture: Enabling developers to utilize any preferred tools (Python, R, etc.).
    2. Deployment Flexibility: The solution can reside in various environments—commercial cloud, government cloud, or on-premise.
    3. Interoperability: Our platform integrates seamlessly with high-quality capabilities from commercial vendors.
    4. Extensibility: Being an API-first platform, we can easily adapt and scale.

    Introduction to Weights & Biases

    Now I’d like to hand it over to Kevin Stofen from Weights & Biases.

    Kevin: Thank you! I’m excited to share insights from Weights & Biases, where we offer a suite of tools for collaboration among machine learning developers. Our core tools include:

    • Experiment Tracking: Tying together all metrics and details from ongoing machine learning experiments.
    • Artifact Management: Automatically versioning datasets and models.
    • Model Registry: Facilitating sharing and tracking model lineage.

    We've also extended our tools to accommodate model CI/CD, enabling reproducible experiments across various computing environments.
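
    As a concrete illustration, here is a minimal sketch of the experiment-tracking and artifact-versioning workflow using the wandb Python client; the project name, metrics, and model file path are placeholders.

    ```python
    # A minimal sketch of experiment tracking and artifact versioning with
    # Weights & Biases. Names and values are illustrative placeholders.
    import wandb

    run = wandb.init(project="uuv-mine-detection", config={"lr": 1e-3, "epochs": 10})

    for epoch in range(run.config.epochs):
        # ... training step would go here ...
        run.log({"epoch": epoch, "train_loss": 1.0 / (epoch + 1)})  # placeholder metric

    # Version the trained model file as an artifact; repeated runs create new versions.
    artifact = wandb.Artifact("mine-detector", type="model")
    artifact.add_file("detector.joblib")  # path to a saved model (assumed to exist)
    run.log_artifact(artifact)

    run.finish()
    ```

    Each run of this script logs its metrics to the named project and produces a new version of the mine-detector artifact, which can then be shared and tracked for lineage through the model registry.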

    Introduction to Fiddler AI

    Next up, let’s hear from Josh Rubin at Fiddler.

    Josh: Thanks, Kevin! I want to highlight what Fiddler does—our focus is on production observability for predictive and generative AI. The Fiddler platform serves as a cockpit for telemetry data coming from AI models. It tracks model performance, correctness, and data drift while providing deep explainability capabilities.

    Through our engagement with the Navy for underwater mine detection, we've delivered essential tools to monitor performance and understand failure modes.
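
    To make drift monitoring concrete, here is a hand-rolled population stability index (PSI) check, the kind of drift signal an observability platform computes automatically on production telemetry. This is purely illustrative and is not the Fiddler client API.

    ```python
    # Illustrative only: a simple PSI drift check between a training-time
    # (reference) feature distribution and a production sample.
    import numpy as np

    def psi(reference: np.ndarray, production: np.ndarray, bins: int = 10) -> float:
        """Population Stability Index between a reference and a production sample."""
        edges = np.histogram_bin_edges(reference, bins=bins)
        ref_frac = np.histogram(reference, bins=edges)[0] / len(reference)
        prod_frac = np.histogram(production, bins=edges)[0] / len(production)
        # Clip to avoid log(0) on empty bins.
        ref_frac = np.clip(ref_frac, 1e-6, None)
        prod_frac = np.clip(prod_frac, 1e-6, None)
        return float(np.sum((prod_frac - ref_frac) * np.log(prod_frac / ref_frac)))

    rng = np.random.default_rng(1)
    baseline = rng.normal(0.0, 1.0, 10_000)  # feature distribution at training time
    drifted = rng.normal(0.5, 1.2, 10_000)   # shifted distribution seen in the field

    print(f"PSI = {psi(baseline, drifted):.3f}")  # > 0.2 is a common "significant drift" threshold
    ```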

    Project Overview: Addressing Underwater Mine Detection Challenges

    To summarize, the Navy employs computer vision models that scan the seabed to detect mines. However, several challenges hindered the deployment and retraining processes:

    • Developers worked in isolated environments, complicating data access.
    • Handoffs between development and production teams were cumbersome.
    • Models used in UUVs required extensive refactoring for optimal performance.
    • Lack of real-time monitoring made updates challenging.

    With our integrated capabilities, we drastically decreased deployment time from six months to just two weeks and retraining time from 12 months to two weeks. The new setup also lays a foundational framework that the Navy can apply to future use cases.

    MLOps Pipeline Flow

    This diagram illustrates the MLOps pipeline, detailing contributions from each vendor. Domino streamlines model development by supporting cloud-native tools, allowing developers to work in AWS GovCloud. Weights & Biases enhances the model registry for tracking iterative model development, while Fiddler provides the monitoring and explainability required in production.
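
    As a rough illustration of that flow (not actual integration code), the sketch below strings the stages together, with placeholder functions standing in for each vendor's product.

    ```python
    # A high-level, illustrative sketch of the pipeline's feedback loop.
    # Placeholder logic stands in for each vendor's product: Domino for
    # development, Weights & Biases for the registry, Fiddler for monitoring.
    import random

    def develop_model(round_num: int) -> str:
        # Stand-in for training/evaluation inside a Domino workspace in AWS GovCloud.
        return f"mine-detector-candidate-{round_num}"

    def register_model(model: str) -> str:
        # Stand-in for versioning the candidate in the W&B model registry.
        return f"{model}:v1"

    def deploy_to_uuv(version: str) -> None:
        # Stand-in for packaging and pushing the registered model to the UUV edge device.
        print(f"deployed {version}")

    def drift_detected() -> bool:
        # Stand-in for a Fiddler drift/performance alert on production telemetry.
        return random.random() < 0.3

    version = register_model(develop_model(0))
    deploy_to_uuv(version)

    # Monitoring feeds directly back into retraining and redeployment; this closed
    # loop is what shortened the update cycle from months to weeks.
    for cycle in range(1, 6):
        if drift_detected():
            version = register_model(develop_model(cycle))
            deploy_to_uuv(version)
    ```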

    Common Challenges in the Public Sector

    As we broaden our scope, it is worth discussing the challenges that recur across the public sector. A collaborative environment, supported by adaptable standards, can improve efficiency and reduce the bottlenecks commonly experienced during model development.

    Conclusion

    As we conclude, this collaborative effort among vendors has demonstrated that implementing a modern MLOps pipeline can yield significant improvements. Moreover, the extensible framework can support a range of predictive and generative AI use cases beyond defense, extending into the civilian and intelligence sectors.


    Keywords

    • MLOps
    • Underwater Mine Detection
    • Domino Data Lab
    • Weights & Biases
    • Fiddler AI
    • Model Development
    • AI Observability
    • Public Sector Challenges
    • Cloud-Native Tools

    FAQ

    1. What is the role of Domino Data Lab in this project?
      Domino Data Lab streamlines the model lifecycle, facilitating data access, model training, deployment, and monitoring.

    2. How did Weights & Biases contribute to the project?
      They provided tools for experiment tracking, artifact management, and a model registry for tracking lineage and enabling collaboration.

    3. What challenges were faced in prior deployment practices?
      Challenges included isolated environments, inefficient handoffs between teams, cumbersome refactoring for UUVs, and a lack of real-time model monitoring.

    4. How was the time for deployment and retraining models improved?
      The integrated MLOps solution reduced deployment time from six months to two weeks and retraining time from 12 months to two weeks.

    5. What are the future implications of this project?
      The project establishes a framework applicable to various AI use cases, enhancing efficiency and allowing the Navy to quickly respond to new challenges.
