    AI may not need massive training data after all

By big tee tech hub · January 6, 2026 · 3 Mins Read

    New research from Johns Hopkins University shows that artificial intelligence systems built with designs inspired by biology can begin to resemble human brain activity even before they are trained on any data. The study suggests that how AI is structured may be just as important as how much data it processes.

    The findings, published in Nature Machine Intelligence, challenge the dominant strategy in AI development. Instead of relying on months of training, enormous datasets, and vast computing power, the research highlights the value of starting with a brain-like architectural foundation.

Rethinking the Data-Heavy Approach to AI

    “The way that the AI field is moving right now is to throw a bunch of data at the models and build compute resources the size of small cities. That requires spending hundreds of billions of dollars. Meanwhile, humans learn to see using very little data,” said lead author Mick Bonner, assistant professor of cognitive science at Johns Hopkins University. “Evolution may have converged on this design for a good reason. Our work suggests that architectural designs that are more brain-like put the AI systems in a very advantageous starting point.”

    Bonner and his colleagues aimed to test whether architecture alone could give AI systems a more human-like starting point, without relying on large-scale training.

    Comparing Popular AI Architectures

    The research team focused on three major types of neural network designs commonly used in modern AI systems: transformers, fully connected networks, and convolutional neural networks.

    They repeatedly adjusted these designs to create dozens of different artificial neural networks. None of the models were trained beforehand. The researchers then showed the untrained systems images of objects, people, and animals and compared their internal activity to brain responses from humans and non-human primates viewing the same images.
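Comparing a network's internal activity to brain responses is commonly done with representational similarity analysis (RSA): build a dissimilarity matrix over stimuli for each system, then correlate the two matrices. The paper's exact metric and models are not specified here, so the following is only a minimal sketch of that general approach, using random (untrained) convolutional filters and synthetic arrays in place of real images and recorded neural data:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

rng = np.random.default_rng(0)

def random_conv_features(images, n_filters=16, k=3):
    """Apply random, untrained k x k filters plus a ReLU to each image,
    a stand-in for one untrained convolutional layer. Returns one
    flattened activation vector per image."""
    filters = rng.standard_normal((n_filters, k, k))
    # (n, h-k+1, w-k+1, k, k) patches, contracted against each filter
    patches = sliding_window_view(images, (k, k), axis=(1, 2))
    feats = np.einsum('nyxij,fij->nfyx', patches, filters)
    return np.maximum(feats, 0).reshape(images.shape[0], -1)

def rdm(responses):
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between response vectors for every pair of stimuli."""
    return 1.0 - np.corrcoef(responses)

def rsa_score(rdm_a, rdm_b):
    """Correlate the upper triangles of two RDMs (the usual RSA summary)."""
    iu = np.triu_indices_from(rdm_a, k=1)
    return np.corrcoef(rdm_a[iu], rdm_b[iu])[0, 1]

# Toy stimuli and simulated "brain" responses; a real analysis would use
# the study's images and recorded human/primate data for the same stimuli.
images = rng.standard_normal((8, 12, 12))
brain = rng.standard_normal((8, 100))

score = rsa_score(rdm(random_conv_features(images)), rdm(brain))
```

With random data on both sides the score is meaningless; the point is only the pipeline shape, in which the model side requires no training at all before the comparison can be made.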

    Why Convolutional Networks Stood Out

    Increasing the number of artificial neurons in transformers and fully connected networks produced little meaningful change. However, similar adjustments to convolutional neural networks led to activity patterns that more closely matched those seen in the human brain.

    According to the researchers, these untrained convolutional models performed on par with traditional AI systems that typically require exposure to millions or even billions of images. The results suggest that architecture plays a larger role in shaping brain-like behavior than previously believed.

    A Faster Path to Smarter AI

    “If training on massive data is really the crucial factor, then there should be no way of getting to brain-like AI systems through architectural modifications alone,” Bonner said. “This means that by starting with the right blueprint, and perhaps incorporating other insights from biology, we may be able to dramatically accelerate learning in AI systems.”

    The team is now exploring simple learning methods inspired by biology that could lead to a new generation of deep learning frameworks, potentially making AI systems faster, more efficient, and less dependent on massive datasets.



