
    Posit AI Blog: safetensors 0.1.0

By big tee tech hub · May 30, 2025 · 4 min read


safetensors is a new, simple, fast, and safe file format for storing tensors. The design of the file format and its original implementation are led
by Hugging Face, and it has been widely adopted in their popular ‘transformers’ framework. The safetensors R package is a pure-R implementation that can both read and write safetensors files.

    The initial version (0.1.0) of safetensors is now on CRAN.

    Motivation

    The main motivation for safetensors in the Python community is security. As noted
    in the official documentation:

    The main rationale for this crate is to remove the need to use pickle on PyTorch which is used by default.

Pickle is considered an unsafe format: loading a Pickle file can
trigger the execution of arbitrary code. This has never been a concern for torch
for R users, since the Pickle parser included in LibTorch supports only a subset
of the Pickle format, one that doesn’t allow executing code.

    However, the file format has additional advantages over other commonly used formats, including:

    • Support for lazy loading: You can choose to read a subset of the tensors stored in the file.

    • Zero copy: Reading the file requires no more memory than the file itself.
      (Technically, the current R implementation makes a single copy, but that could
      be optimized away if we really need it at some point.)

    • Simple: Implementing the file format is simple, and doesn’t require complex dependencies.
      This means that it’s a good format for exchanging tensors between ML frameworks and
      between different programming languages. For instance, you can write a safetensors file
      in R and load it in Python, and vice-versa.

    There are additional advantages compared to other file formats common in this space, and
    you can see a comparison table here.

    Format

    The safetensors format is described in the figure below. It’s basically a header
    containing some metadata, followed by raw tensor buffers.

    [Figure: Diagram describing the safetensors file format.]
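The header layout can be illustrated with base R alone. The sketch below hand-assembles a minimal file matching the layout described above (an 8-byte little-endian header length, a JSON header, then the raw buffers, for a single F32 tensor) and reads the header back; a JSON parser such as jsonlite could then turn the string into a list. This is an illustrative sketch, not the package's internal implementation:

```r
# A minimal safetensors header for one tensor named "x"
header <- '{"x":{"dtype":"F32","shape":[2],"data_offsets":[0,8]}}'

tmp <- tempfile()
con <- file(tmp, "wb")
# Header length as an unsigned 64-bit little-endian integer,
# written as two 4-byte words (low word first) since it is < 2^32
writeBin(c(nchar(header), 0L), con, size = 4, endian = "little")
writeBin(charToRaw(header), con)
# Raw tensor buffer: two float32 values (8 bytes total)
writeBin(c(1.5, 2.5), con, size = 4, endian = "little")
close(con)

# Read it back: the first 8 bytes give the header size
con <- file(tmp, "rb")
n <- sum(as.integer(readBin(con, "raw", n = 8)) * 256^(0:7))
json <- rawToChar(readBin(con, "raw", n = n))
close(con)
json
```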

    Basic usage

    safetensors can be installed from CRAN using:

    install.packages("safetensors")

    We can then write any named list of torch tensors:

    library(torch)
    library(safetensors)
    
    tensors <- list(
      x = torch_randn(10, 10),
      y = torch_ones(10, 10)
    )
    
    str(tensors)
    #> List of 2
    #>  $ x:Float [1:10, 1:10]
    #>  $ y:Float [1:10, 1:10]
    
    tmp <- tempfile()
    safe_save_file(tensors, tmp)

    It’s possible to attach additional metadata to the saved file by providing a metadata
    argument containing a named list.
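For instance (a sketch based on the parameter described above; the field names are made up for illustration, and the safetensors header stores metadata values as strings):

```r
library(torch)
library(safetensors)

tensors <- list(x = torch_randn(2, 2))

tmp <- tempfile()
# metadata is a named list; it ends up in the file header alongside
# the per-tensor entries
safe_save_file(tensors, tmp, metadata = list(trained_by = "me", epochs = "10"))
```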

    Reading safetensors files is handled by safe_load_file; it returns the named
    list of tensors, along with a metadata attribute containing the parsed file header.

    tensors <- safe_load_file(tmp)
    str(tensors)
    #> List of 2
    #>  $ x:Float [1:10, 1:10]
    #>  $ y:Float [1:10, 1:10]
    #>  - attr(*, "metadata")=List of 2
    #>   ..$ x:List of 3
    #>   .. ..$ shape       : int [1:2] 10 10
    #>   .. ..$ dtype       : chr "F32"
    #>   .. ..$ data_offsets: int [1:2] 0 400
    #>   ..$ y:List of 3
    #>   .. ..$ shape       : int [1:2] 10 10
    #>   .. ..$ dtype       : chr "F32"
    #>   .. ..$ data_offsets: int [1:2] 400 800
    #>  - attr(*, "max_offset")= int 929

    Currently, safetensors only supports writing torch tensors, but we plan to add
    support for writing plain R arrays and tensorflow tensors in the future.

    Future directions

    The next version of torch will use safetensors as its serialization format,
    meaning that when calling torch_save() on a model, list of tensors, or other
    types of objects supported by torch_save, you will get a valid safetensors file.
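In other words, no code changes should be needed: the same torch_save()/torch_load() calls will simply produce and consume safetensors files. A sketch, assuming the upcoming torch release:

```r
library(torch)

model <- nn_linear(10, 1)

tmp <- tempfile()
torch_save(model, tmp)   # with the next torch release, this writes a safetensors file
restored <- torch_load(tmp)
```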

    This is an improvement over the previous implementation because:

    1. It’s much faster: more than 10x for medium-sized models, and potentially even more for large files.
      This also improves the performance of parallel dataloaders by around 30%.

    2. It enhances cross-language and cross-framework compatibility. You can train your model
      in R and use it in Python (and vice-versa), or train your model in tensorflow and run it
      with torch.

    If you want to try it out, you can install the development version of torch with:

    remotes::install_github("mlverse/torch")

    Photo by Nick Fewings on Unsplash

    Posts also available at r-bloggers

    Reuse

    Text and figures are licensed under Creative Commons Attribution CC BY 4.0. The figures that have been reused from other sources don’t fall under this license and can be recognized by a note in their caption: “Figure from …”.

    Citation

    For attribution, please cite this work as

    Falbel (2023, June 15). Posit AI Blog: safetensors 0.1.0. Retrieved from 

    BibTeX citation

    @misc{safetensors,
      author = {Falbel, Daniel},
      title = {Posit AI Blog: safetensors 0.1.0},
      url = {},
      year = {2023}
    }


