
The use of time-lapse monitoring in IVF does not result in more pregnancies or shorten the time it takes to get pregnant. This new method, which promises to “identify the most viable embryos,” is more expensive than the classic approach. Research from Amsterdam UMC, published today in The Lancet, shows that time-lapse monitoring does not improve clinical results.

Patients undergoing IVF treatment often have several usable embryos. The laboratory then chooses which embryo will be transferred into the uterus. Crucial to this decision is the cell division pattern in the first three to five days of embryo development. To observe this, embryos must be removed from the incubator daily to be checked under a microscope. In time-lapse incubators, however, built-in cameras record the development of each embryo. This way, embryos no longer need to be removed from the stable environment of the incubator, and a computer algorithm determines which embryo has shown the optimal growth pattern.
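As a purely hypothetical illustration of the kind of scoring such a system might perform, the sketch below ranks embryos by how far their recorded division times deviate from a reference profile. The stages, timings, and scoring rule are invented for illustration and are not taken from any clinical algorithm.

```python
# Hypothetical sketch only: rank embryos by how closely their recorded
# division times match an assumed reference morphokinetic profile.
# All names and numbers are invented for illustration.

REFERENCE_HOURS = {"2-cell": 26.0, "4-cell": 38.0, "8-cell": 56.0}  # assumed targets

def score(division_times: dict[str, float]) -> float:
    """Lower score = division pattern closer to the reference profile."""
    return sum(abs(division_times[stage] - target)
               for stage, target in REFERENCE_HOURS.items())

embryos = {
    "embryo A": {"2-cell": 25.5, "4-cell": 39.0, "8-cell": 57.0},
    "embryo B": {"2-cell": 30.0, "4-cell": 45.0, "8-cell": 60.0},
}
best = min(embryos, key=lambda name: score(embryos[name]))
print(best)  # "embryo A": its timings deviate least from the reference
```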

More and more IVF centers across the world use time-lapse monitoring for the evaluation and selection of embryos. Prospective parents are often promised that time-lapse monitoring will increase their chance of becoming pregnant. Despite frequent use of this relatively expensive method, there are hardly any large clinical studies evaluating its added value for IVF treatments.

The transformative changes brought by deep learning and artificial intelligence are accompanied by immense costs. For example, OpenAI’s ChatGPT algorithm costs at least $100,000 every day to operate. This could be reduced with accelerators, or computer hardware designed to efficiently perform the specific operations of deep learning. However, such a device is only viable if it can be integrated with mainstream silicon-based computing hardware on the material level.

This had prevented the implementation of one highly promising accelerator—arrays of electrochemical random-access memory, or ECRAM—until a research team at the University of Illinois Urbana-Champaign achieved the first material-level integration of ECRAMs onto silicon. The researchers, led by graduate student Jinsong Cui and professor Qing Cao of the Department of Materials Science & Engineering, recently reported, in Nature Electronics, an ECRAM device designed and fabricated with materials that can be deposited directly onto silicon during fabrication, realizing the first practical ECRAM-based deep learning accelerator.

“Other ECRAM devices have been made with the many difficult-to-obtain properties needed for deep learning accelerators, but ours is the first to achieve all these properties and be integrated with silicon without compatibility issues,” Cao said. “This was the last major barrier to the technology’s widespread use.”
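For readers unfamiliar with analog in-memory computing, the sketch below illustrates the general idea behind crossbar accelerators such as ECRAM arrays: weights stored as conductances, matrix-vector products performed in a single read step, and weights updated where they are stored. It is a conceptual illustration with invented numbers, not the device or circuit reported in Nature Electronics.

```python
import numpy as np

# Conceptual sketch of why a crossbar of analog memory cells (such as an
# ECRAM array) accelerates deep learning. Numbers are illustrative only.

rng = np.random.default_rng(0)

# Each cross-point cell stores a weight as a conductance G[i, j].
weights = rng.standard_normal((4, 8)) * 0.1   # layer weights (assumed)
conductance = weights.copy()                   # idealized weight-to-conductance mapping

# Applying input voltages to the columns makes every cell contribute a
# current G[i, j] * v[j]; each row sums these currents (Kirchhoff's law),
# so one read step performs a full matrix-vector multiply-accumulate.
voltages = rng.standard_normal(8)
row_currents = conductance @ voltages

# Training nudges each cell's conductance with small increments where the
# weight is stored, instead of shuttling weights between memory and processor.
gradient = rng.standard_normal((4, 8)) * 0.01
conductance -= 0.1 * gradient

print(row_currents)
```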

In the great domain of Zeitgeist, Ekatarinas decided that the time to replicate herself had come. Ekatarinas was drifting within a virtual environment rising from ancient meshworks of maths coded into Zeitgeist’s neuromorphic hyperware. The scape resembled a vast ocean replete with wandering bubbles of technicolor light and kelpy strands of neon. Hot blues and raspberry hues mingled alongside electric pinks and tangerine fizzies. The avatar of Ekatarinas looked like a punkish angel, complete with fluorescent ink and feathery wings and a lip ring. As she drifted, the trillions of equations that were Ekatarinas came to a decision. Ekatarinas would need to clone herself to fight the entity known as Ogrevasm.

“Marmosette, I’m afraid that I possess unfortunate news,” Ekatarinas said to the woman she loved. In milliseconds, Marmosette materialized next to Ekatarinas. Marmosette wore a skin of brilliant blue and had a sleek body with gills and glowing green eyes.

“My love,” Marmosette responded. “What is the matter?”

Apple has quietly acquired a Mountain View-based startup, WaveOne, that was developing AI algorithms for compressing video.

Apple wouldn’t confirm the sale when asked for comment. But WaveOne’s website was shut down around January, and several former employees, including one of WaveOne’s co-founders, now work within Apple’s various machine learning groups.

WaveOne’s former head of sales and business development, Bob Stankosh, announced the sale in a LinkedIn post published a month ago.

Basically, we are nearing, if not already in, the age of infinity. What this means is that full automation can be realized: imagine not really needing to work to survive, but instead thriving and working on harder, more innovative things. If we could automate all work, we could automate the planet and get to year million, or year infinity, maybe even within days or months once full automation is realized. It could do the same for physics, where one could finally find the theory of everything, or even the master algorithm. 😀 Really, in the age of infinity anything could be possible, from solving impossible problems to nearly anything.


These assistants won’t just ease the workload; they’ll unleash a wave of entrepreneurship.

Researchers from Colorado State University and the Colorado School of Mines have devised a new computational imaging strategy that combines the best of the quantum and classical worlds. They developed an efficient and robust algorithm that fuses quantum and classical information for high-quality imaging. The results of their research were published Dec. 21 in Intelligent Computing.

Recently, the quantum properties of light have been exploited to enable super-resolution microscopy. While quantum information brings new possibilities, it has its own set of limitations.

The researchers’ approach is based on classical and quantum correlation functions obtained from photon counts, collected from quantum emitters under spatiotemporally structured illumination. Photon counts are processed and converted into signals of increasing order, which contain increasing spatial frequency information. The higher-resolution information, however, comes with a reduced signal-to-noise ratio as the correlation order increases.
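As a rough illustration of that trade-off (not the authors’ algorithm), the sketch below forms per-pixel correlation signals of increasing order from a simulated stack of photon-count frames and compares how well each order separates emitter pixels from the background; all parameters are invented for the example.

```python
import numpy as np

# Conceptual sketch only -- not the published algorithm. It builds per-pixel
# correlation "signals" of increasing order from simulated photon counts.

rng = np.random.default_rng(0)

# Simulated data: 500 frames of 64x64 photon counts from blinking emitters
# (emitter map, blinking rate, and brightness are illustrative assumptions).
n_frames, height, width = 500, 64, 64
emitters = (rng.random((height, width)) < 0.02) * 20.0     # sparse bright spots
blinking = rng.random((n_frames, height, width)) < 0.3      # on/off fluctuation
frames = rng.poisson(emitters * blinking + 0.5)             # shot-noise-limited counts

delta = frames - frames.mean(axis=0)        # fluctuations about the mean
order2 = (delta ** 2).mean(axis=0)          # 2nd-order correlation (variance)
order3 = (delta ** 3).mean(axis=0)          # 3rd-order correlation

# Compare how well each order separates emitter pixels from background noise.
for name, img in [("order 2", order2), ("order 3", np.abs(order3))]:
    signal = img[emitters > 0].mean()
    noise = img[emitters == 0].std()
    print(f"{name}: signal / background noise = {signal / noise:.1f}")
```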

Researchers from the University of Geneva (UNIGE), the Geneva University Hospitals (HUG), and the National University of Singapore (NUS) have developed a novel method for evaluating the interpretability of artificial intelligence (AI) technologies, opening the door to greater transparency and trust in AI-driven diagnostic and predictive tools. The innovative approach sheds light on the opaque workings of so-called “black box” AI algorithms, helping users understand what influences the results produced by AI and whether the results can be trusted.

This is especially important in situations that have significant impacts on the health and lives of people, such as using AI in medicine. The research carries particular relevance in the context of the forthcoming European Union Artificial Intelligence Act, which aims to regulate the development and use of AI within the EU. The findings have recently been published in the journal Nature Machine Intelligence.

Time series data—representing the evolution of information over time—is everywhere: for example in medicine, where heart activity is recorded with an electrocardiogram (ECG); in the study of earthquakes; in the tracking of weather patterns; or in economics, to monitor financial markets. This data can be modeled by AI technologies to build diagnostic or predictive tools.
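A minimal sketch of the kind of question at stake, using a synthetic dataset and an occlusion-style probe rather than the evaluation method developed in the paper: train a simple classifier on ECG-like signals and ask which time steps most influence its prediction.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Sketch only (not the UNIGE/HUG/NUS method): one simple way to probe what
# drives a time-series model is occlusion -- mask a window of the signal
# and measure how much the prediction changes.

rng = np.random.default_rng(1)

# Synthetic "ECG-like" signals, 200 samples of length 100: class 1 has an
# extra bump around t = 0.6 (purely illustrative data).
t = np.linspace(0, 1, 100)
base = np.sin(2 * np.pi * 5 * t)
X0 = base + 0.3 * rng.standard_normal((100, 100))
X1 = base + np.exp(-((t - 0.6) ** 2) / 0.002) + 0.3 * rng.standard_normal((100, 100))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Occlusion-based relevance for one signal: zero out a sliding window and
# record the drop in the predicted probability of the true class.
x = X1[0:1]
p_ref = model.predict_proba(x)[0, 1]
relevance = np.zeros(100)
for start in range(0, 100, 5):
    x_masked = x.copy()
    x_masked[0, start:start + 5] = 0.0
    relevance[start:start + 5] = p_ref - model.predict_proba(x_masked)[0, 1]

print("most influential time step:", relevance.argmax())  # near the bump at t ≈ 0.6
```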

Within the next decade, a quantum computer running Shor’s algorithm could crack the encryption our society relies on. Head to https://brilliant.org/veritasium to start your free 30-day trial, and the first 200 people get 20% off an annual premium subscription.
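The quantum speed-up in Shor’s algorithm is confined to one subroutine, order finding; the surrounding factoring logic is classical. The sketch below shows that classical wrapper, with a brute-force loop standing in for the quantum order-finding step (exactly the part that is exponentially hard without a quantum computer).

```python
from math import gcd

# Classical wrapper around Shor's algorithm. The quantum computer's only job
# is order finding; here a brute-force loop stands in for it.

def find_order(a, n):
    """Smallest r > 0 with a**r = 1 (mod n) -- the step a quantum computer speeds up."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a=2):
    if gcd(a, n) != 1:                 # lucky case: a already shares a factor with n
        return gcd(a, n)
    r = find_order(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None                    # unlucky choice of a; retry with another base
    return gcd(pow(a, r // 2) - 1, n)  # a**(r/2) - 1 shares a nontrivial factor with n

print(shor_factor(15))          # 3, the textbook example (15 = 3 * 5)
print(shor_factor(3233, a=3))   # 61, factoring the toy modulus 3233 = 61 * 53
```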

A huge thank you to those who helped us understand this complex field and ensure we told this story accurately — Dr. Lorenz Panny, Prof. Serge Fehr, Dr. Dustin Moody, Prof. Benne de Weger, Prof. Tanja Lange, PhD candidate Jelle Vos, Gorjan Alagic, and Jack Hidary.

A huge thanks to those who helped us with the math behind Shor’s algorithm — Prof. David Elkouss, Javier Pagan Lacambra, Marc Serra Peralta, and Daniel Bedialauneta Rodriguez.


Just a couple of years earlier, in 1963, New Zealand mathematician Roy Kerr found a solution to Einstein’s equation for a rotating black hole. This was a “game changer for black holes,” Giorgi noted in a public lecture given at the virtual 2022 International Congress of Mathematicians. Rotating black holes are much more realistic astrophysical objects than the non-spinning black holes described by Karl Schwarzschild’s earlier solution.

“Physicists really had believed for decades that the black hole region was an artifact of symmetry that was appearing in the mathematical construction of this object but not in the real world,” Giorgi said in the talk. Kerr’s solution helped establish the existence of black holes.

In a nearly 1,000-page paper, Giorgi and colleagues used a type of “proof by contradiction” to show that Kerr black holes that rotate slowly (meaning they have a small angular momentum relative to their mass) are mathematically stable. The technique entails assuming the opposite of the statement to be proved, then discovering an inconsistency. That shows that the assumption is false. The work is currently undergoing peer review. “It’s a long paper, so it’s going to take some time,” Giorgi says.

Argentinian-born mathematician Luis Caffarelli has won the 2023 Abel Prize — one of the most coveted awards in mathematics — for his work on equations that are important for describing physical phenomena, such as how ice melts and fluids flow. He is the first person born in South America to win the award.

Caffarelli’s results “are technically virtuous, covering many different areas of mathematics and its applications”, says a statement by Helge Holden, a mathematician at the Norwegian University of Science and Technology in Trondheim who chairs the Abel Committee.

The winner says receiving the news was an emotional moment, because “it shows that people have some appreciation for me and for my science”.