Fyself News
Science

New ‘Dragon Hatchling’ AI architecture modeled after the human brain could be a key step toward AGI, researchers claim

November 13, 2025

Researchers have designed a new type of large language model (LLM) that they say could bridge the gap between artificial intelligence (AI) and more human-like cognition.

Researchers at AI startup Pathway, which developed the model, say “Dragon Hatchling” is designed to more accurately simulate how neurons in the brain connect and strengthen through learning. They described it as the first model that can “generalize over time,” meaning it automatically adjusts its neural wiring in response to new information.

In the study, uploaded to the preprint database arXiv on September 30, the team presented the model as a successor to the architectures that underpin generative AI tools such as ChatGPT and Google Gemini. They further suggested that it could provide the “missing link” between today’s AI technologies and more advanced, brain-inspired models of intelligence.


“There’s a lot of discussion going on right now, especially around reasoning models, about whether you can extend reasoning beyond the patterns seen in training data, and whether you can generalize to more complex or longer reasoning patterns,” Adrian Kosowski, co-founder and chief scientific officer at Pathway, said on the Super Data Science podcast on October 7.

“The evidence is largely inconclusive, and the answer is generally no. Currently, machines do not generalize reasoning the way humans do. We believe this is a major challenge, and the architecture we are proposing has the potential to bring about significant change.”

A step towards AGI?

Teaching AI to think like humans is one of the field’s most important goals. However, reaching this level of simulated cognition, often referred to as artificial general intelligence (AGI), remains difficult.

A key challenge is that human thinking is inherently messy. Our thoughts rarely appear as neat, linear sequences of connected information. Rather, the human brain is a chaotic tangle of overlapping thoughts, sensations, emotions, and impulses constantly competing for our attention.


[Image: Diagram of a network of connected lines and points. Credit: JESPER KLAUSEN / SCIENCE PHOTO LIBRARY / Getty Images]

In recent years, LLMs have brought the AI industry ever closer to simulating human-like reasoning. LLMs are typically powered by transformers, a type of deep learning architecture that allows AI models to connect words and ideas during a conversation. Transformers are the “brains” behind generative AI tools like ChatGPT, Gemini, and Claude, allowing them to interact with and respond to users with (at least most of the time) a convincing level of apparent “awareness.”

Transformers are extremely sophisticated, but they also mark the limits of existing generative AI capabilities. One reason is that they don’t learn continuously: once an LLM is trained, the parameters governing it are locked, so new knowledge can only be added through retraining or fine-tuning. When an LLM encounters something new, it simply generates a response based on what it already knows.
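This frozen-parameter behavior can be illustrated with a toy sketch (the names and the single weight matrix here are illustrative stand-ins, not Pathway's or any real model's code): during inference, a trained model's parameters are only read, never updated, so a novel input changes nothing about the model itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "model": one fixed weight matrix standing in for a trained LLM's parameters.
W = rng.standard_normal((4, 4))

def generate(x, W):
    """Inference: read-only use of the parameters. W is never modified."""
    return np.tanh(W @ x)

x_new = rng.standard_normal(4)   # "something new" the model encounters
W_before = W.copy()
_ = generate(x_new, W)

# The parameters are identical before and after inference; absorbing new
# knowledge would require a separate retraining or fine-tuning pass.
assert np.array_equal(W, W_before)
```

Adding new knowledge would mean an explicit gradient step on `W` outside the inference loop, which is exactly the retraining/fine-tuning cost the article describes.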

Imagine dragons

Dragon Hatchling, by contrast, is designed to dynamically adapt its understanding beyond its training data. It does this by updating its internal connections in real time as it processes new inputs, much as biological neurons strengthen or weaken over time. This could support continuous learning, the researchers said.


Unlike typical transformer architectures, which process information through stacked layers of nodes, Dragon Hatchling’s architecture behaves like a flexible web that reorganizes itself as new information arrives. Tiny “neuronal particles” continually exchange information and adjust their connections, strengthening some and weakening others.

Over time, new pathways form that help the model retain and apply what it has learned to future situations, effectively giving it a kind of short-term memory that shapes how it handles new inputs. Unlike a traditional LLM, however, Dragon Hatchling’s memory comes from continuous adaptation within the architecture itself, rather than from a fixed context window or knowledge frozen in at training time.
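The strengthen-and-weaken dynamic described above resembles Hebbian plasticity ("neurons that fire together wire together"). A minimal sketch, assuming a simple Hebbian update with uniform decay (illustrative only; the paper's actual local update rule may differ):

```python
import numpy as np

def hebbian_step(W, pre, post, lr=0.1, decay=0.01):
    """Strengthen connections between co-active units; let unused ones fade.

    W    -- connection weights between "neuronal particles"
    pre  -- pre-synaptic activity vector
    post -- post-synaptic activity vector
    """
    W = W + lr * np.outer(post, pre)   # co-activity strengthens a connection
    W = W * (1.0 - decay)              # all connections weaken slightly over time
    return W

W = np.zeros((3, 3))                   # start with no learned pathways
pre = np.array([1.0, 0.0, 1.0])
post = np.array([0.0, 1.0, 1.0])

for _ in range(5):                     # processing a stream of inputs
    W = hebbian_step(W, pre, post)

# Pathways between co-active units have strengthened; connections
# involving units that never fired together remain at zero.
```

Because the update depends only on recent activity, repeated co-activation builds a persistent pathway while idle connections decay, which is one way an architecture can carry short-term memory in its weights rather than in a context window.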

In testing, Dragon Hatchling performed comparably to GPT-2 on benchmark language-modeling and translation tasks, which the researchers note is an impressive feat for a brand-new prototype architecture.

Although the paper has not yet been peer-reviewed, the team hopes the model will serve as a foundational step toward AI systems that learn and adapt autonomously. In theory, this could mean AI models that get smarter the longer they’re online, for better or worse.

