    Artificial Intelligence

    Fine-tuning LLMs with user-level differential privacy

By big tee tech hub · May 27, 2025 · 2 Mins Read

    Making these algorithms work for LLMs

    Running these algorithms "out of the box" on LLMs goes badly, so we developed optimizations that fix the key issues that arise at that scale.

    For ELS, we had to convert example-level DP guarantees into user-level DP guarantees. Previous work did this conversion in a way that added orders of magnitude more noise than was actually necessary; we were able to prove that significantly less noise suffices, making the model much better while retaining the same privacy guarantees.
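To make the user-level guarantee concrete: clipping each user's aggregated update to a fixed L2 norm bounds the sum's sensitivity to any single user, after which Gaussian noise scaled to that norm privatizes the aggregate. Below is a minimal NumPy sketch of this standard mechanism; it is illustrative only, not the post's actual training code, and all names and parameters are our own.

```python
import numpy as np

def user_level_dp_mean(per_user_grads, clip_norm, noise_multiplier, rng=None):
    """Aggregate one gradient per user with a user-level DP guarantee.

    Each user's contribution is clipped to L2 norm `clip_norm`, so adding or
    removing any one user changes the sum by at most `clip_norm`; Gaussian
    noise with std = noise_multiplier * clip_norm then privatizes the sum.
    """
    rng = rng or np.random.default_rng(0)
    total = np.zeros_like(per_user_grads[0])
    for g in per_user_grads:
        norm = np.linalg.norm(g)
        # Scale down (never up) so the per-user norm is at most clip_norm.
        scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
        total += g * scale
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(per_user_grads)
```

With `noise_multiplier = 0` this reduces to the ordinary clipped mean, which makes the noise-versus-utility trade-off easy to inspect in isolation.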

    For both ELS and ULS, we had to figure out how to optimize the contribution bound. A "default" choice is a contribution bound that every user already satisfies; that is, no pre-processing at all. However, some users may contribute a large amount of data, and providing privacy to those users then requires adding a large amount of noise. Setting a smaller contribution bound reduces the noise we need to add, but at the cost of discarding a lot of data. Because LLM training runs are expensive, we can't afford to train a bunch of models with different contribution bounds and pick the best one — we need an effective strategy for picking the contribution bound before we start training.
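The pre-processing step described above can be sketched directly: cap each user at the chosen bound, keep a random subset of examples for users over it, and report how much data the bound discards, so the noise-versus-data trade-off is explicit. A minimal sketch, with function and parameter names of our own choosing:

```python
import random

def bound_contributions(examples_by_user, bound, seed=0):
    """Cap each user's contribution at `bound` examples.

    Users over the bound keep a uniformly random subset of `bound` examples;
    the rest are discarded. Returns the bounded dataset and the number of
    discarded examples.
    """
    rng = random.Random(seed)
    kept, discarded = {}, 0
    for user, examples in examples_by_user.items():
        if len(examples) > bound:
            kept[user] = rng.sample(examples, bound)
            discarded += len(examples) - bound
        else:
            kept[user] = list(examples)
    return kept, discarded
```

A larger `bound` discards fewer examples but forces more noise (sensitivity grows with the maximum per-user contribution); a smaller one does the reverse, which is exactly the tension the post describes.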

    After lengthy experimentation at scale, we found that for ELS, setting the contribution bound to the median number of examples held per user is an effective strategy. For ULS, we give a prediction for the total noise added as a function of the contribution bound, and found that choosing the bound minimizing this prediction was an effective strategy.
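The two selection strategies can be sketched as follows. The ELS rule (median per-user count) comes straight from the text; for ULS, the post does not give the noise-prediction formula, so it is represented here by a caller-supplied placeholder function:

```python
import statistics

def els_bound(per_user_counts):
    """ELS heuristic from the post: the median per-user example count."""
    return int(statistics.median(per_user_counts))

def uls_bound(per_user_counts, predicted_noise, candidates):
    """ULS heuristic: pick the candidate bound minimizing predicted noise.

    `predicted_noise(bound, per_user_counts)` is a stand-in for the post's
    actual prediction, which is not spelled out in the excerpt.
    """
    return min(candidates, key=lambda b: predicted_noise(b, per_user_counts))
```

Both rules are one-shot: they fix the contribution bound from dataset statistics before training starts, which is the whole point given that repeated LLM training runs are unaffordable.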


