{"id":29773,"date":"2026-03-24T08:45:00","date_gmt":"2026-03-24T13:45:00","guid":{"rendered":"https:\/\/www.ecoticias.com\/en\/?p=29773"},"modified":"2026-03-24T04:39:14","modified_gmt":"2026-03-24T09:39:14","slug":"in-2026-an-ai-is-challenged-to-design-life-from-scratch-and-the-unthinkable-happens-it-starts-with-blind-creatures-and-ends-up-developing-a-functional-visual-system-without-instructions-as-if-evol","status":"publish","type":"post","link":"https:\/\/www.ecoticias.com\/en\/in-2026-an-ai-is-challenged-to-design-life-from-scratch-and-the-unthinkable-happens-it-starts-with-blind-creatures-and-ends-up-developing-a-functional-visual-system-without-instructions-as-if-evol\/29773\/","title":{"rendered":"In 2026, an AI is challenged to design life from scratch, and the unthinkable happens: it starts with blind creatures and ends up developing a functional visual system without instructions, as if evolution had \u201csneaked\u201d into the code"},"content":{"rendered":"\n<p>What if you could rewind the evolution of eyesight and watch it unfold in fast forward on a laptop screen? That is essentially what a team of researchers has done by letting artificial animals evolve eyes inside a virtual world. The result is a digital ecosystem where blind creatures slowly learn to see, and their eyes end up looking surprisingly similar to those found in nature.<\/p>\n\n\n\n<p>The work, led by scientists at Lund University together with colleagues at Massachusetts Institute of Technology (MIT), is described in the journal <a href=\"https:\/\/www.science.org\/doi\/10.1126\/sciadv.ady2888\" target=\"_blank\" rel=\"noopener\"><em>Science Advances<\/em><\/a> and in a recent university press release. 
The team used artificial intelligence not to recognize cats in photos, but to \u201creplay\u201d millions of years of evolution inside a controlled simulation.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Evolution inside a digital world<\/h2>\n\n\n\n<p>In their experiment, the researchers created tiny virtual organisms and dropped them into a synthetic environment built entirely from code. At the start, these creatures were completely blind. <\/p>\n\n\n\n<p>Each had a simple body, a rudimentary nervous system and basic sensors that could, at best, detect light. Their tasks were familiar from the real world. Move through a maze. Avoid obstacles. Find \u201cfood\u201d and stay away from \u201cpoison.\u201d<\/p>\n\n\n\n<p>Generation after generation, the system introduced random variations. Digital animals that navigated better survived and passed on their traits. Those that failed simply disappeared from the gene pool. It is natural selection, only compressed into hours of computing time instead of eons. 
As the virtual climate and tasks stayed constant, the creatures had to adapt or vanish.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Eyes that look strangely familiar<\/h2>\n\n\n\n<p>Over time, simple light-sensitive patches turned into more elaborate structures that could tell the difference between dark and light, then between shapes, and eventually between different objects. <\/p>\n\n\n\n<p>The simulation produced several well-known <a href=\"https:\/\/www.cambridge.org\/core\/journals\/visual-neuroscience\/article\/eye-evolution-and-its-functional-basis\/E632F655150C8D0E7367566CC99F4717\" target=\"_blank\" rel=\"noopener\">eye types<\/a> that biologists see in real animals, including dispersed photoreceptors, compound eyes and camera-like eyes that concentrate light onto a retina.<\/p>\n\n\n\n<p>Professor Dan Eric Nilsson, an evolutionary biologist at Lund, put it plainly. \u201cWe have succeeded in creating artificial evolution that produces the same results as in real life.\u201d He added that the most surprising part was how closely the computer-grown eyes mirrored those of actual organisms, even though the digital world was very simplified.<\/p>\n\n\n\n<p>The <em>Science Advances<\/em> paper goes further and shows that the type of eye that evolves depends heavily on the job that needs to be done. 
In navigation tasks, agents tended to evolve wide, low-resolution vision, similar to compound eyes that are good for spotting motion and avoiding collisions. <\/p>\n\n\n\n<p>When the task shifted to detecting and recognizing specific objects, the winning design looked more like a camera eye, with a focused central field and higher acuity. In later experiments, even lens-like structures emerged to balance sharp vision with enough light, echoing the same trade-offs that shaped eyes in real ecosystems.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">What this has to do with ecology<\/h2>\n\n\n\n<p>In forests, coral reefs and open oceans, eyes are central to survival. Animals scan for predators, locate prey, choose mates and navigate complex habitats, often under tough lighting conditions.<\/p>\n\n\n\n<p>Field biologists know that eye shape and retinal layout tend to match an animal\u2019s niche, but it is hard to test \u201cwhat if\u201d questions in nature. You cannot rerun evolution just to see what would happen if a reef got darker or a prey species became harder to spot.<\/p>\n\n\n\n<p>This digital framework offers a kind of eco laboratory on a chip. By changing the virtual environment or the task, scientists can see which visual strategies appear and which ones fail.<\/p>\n\n\n\n<p>The study even hints at evolutionary arms races, where more challenging detection tasks push agents toward sharper vision and more neural processing power, much like real predators and prey that continually force each other to improve.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">From wild eyes to smart sensors<\/h2>\n\n\n\n<p>The same tool that helps explain how a crab or bird might see its world could also influence devices we use every day. 
The MIT team notes that their <a href=\"https:\/\/news.mit.edu\/2025\/scientific-sandbox-lets-researchers-explore-evolution-vision-systems-1217\" target=\"_blank\" rel=\"noopener\">\u201cscientific sandbox\u201d<\/a> could guide the design of new sensors and cameras for robots, drones and wearable devices that must balance image quality with energy use and manufacturing limits. <\/p>\n\n\n\n<p>Today, some of those machines already inspect crops, fly over forests or survey coastlines. In the future, designs inspired by this kind of artificial evolution might help them work more efficiently in harsh outdoor conditions.<\/p>\n\n\n\n<p>The researchers are careful to point out that a virtual world can never capture all the messy details of real ecosystems. Yet, by letting evolution play out in silico, they gain a powerful way to test ideas about how vision and behavior co-evolve, then bring those questions back to the field and the lab. 
<\/p>\n\n\n\n<p>As Nilsson put it, this is only the beginning, and AI can now help explore evolutionary futures that nature has not reached yet.<\/p>\n\n\n\n<p>The press release was published by <a href=\"https:\/\/www.lunduniversity.lu.se\/article\/researchers-create-ai-animals-simulate-evolution-vision\" target=\"_blank\" rel=\"noopener\"><em>Lund University<\/em><\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>What if you could rewind the evolution of eyesight and watch it unfold in fast forward on a laptop screen? &#8230; <\/p>\n<p class=\"read-more-container\"><a title=\"In 2026, an AI is challenged to design life from scratch, and the unthinkable happens: it starts with blind creatures and ends up developing a functional visual system without instructions, as if evolution had \u201csneaked\u201d into the code\" class=\"read-more button\" href=\"https:\/\/www.ecoticias.com\/en\/in-2026-an-ai-is-challenged-to-design-life-from-scratch-and-the-unthinkable-happens-it-starts-with-blind-creatures-and-ends-up-developing-a-functional-visual-system-without-instructions-as-if-evol\/29773\/#more-29773\" aria-label=\"Read more about In 2026, an AI is challenged to design life from scratch, and the unthinkable happens: it starts with blind creatures and ends up developing a functional visual system without instructions, as if evolution had \u201csneaked\u201d into the code\">Read 
more<\/a><\/p>\n","protected":false},"author":13,"featured_media":29776,"comment_status":"open","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[7],"tags":[],"class_list":["post-29773","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-technology","resize-featured-image"],"_links":{"self":[{"href":"https:\/\/www.ecoticias.com\/en\/wp-json\/wp\/v2\/posts\/29773","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.ecoticias.com\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.ecoticias.com\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.ecoticias.com\/en\/wp-json\/wp\/v2\/users\/13"}],"replies":[{"embeddable":true,"href":"https:\/\/www.ecoticias.com\/en\/wp-json\/wp\/v2\/comments?post=29773"}],"version-history":[{"count":5,"href":"https:\/\/www.ecoticias.com\/en\/wp-json\/wp\/v2\/posts\/29773\/revisions"}],"predecessor-version":[{"id":29832,"href":"https:\/\/www.ecoticias.com\/en\/wp-json\/wp\/v2\/posts\/29773\/revisions\/29832"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.ecoticias.com\/en\/wp-json\/wp\/v2\/media\/29776"}],"wp:attachment":[{"href":"https:\/\/www.ecoticias.com\/en\/wp-json\/wp\/v2\/media?parent=29773"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.ecoticias.com\/en\/wp-json\/wp\/v2\/categories?post=29773"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.ecoticias.com\/en\/wp-json\/wp\/v2\/tags?post=29773"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}