University of Copenhagen – Devstyler.io (https://devstyler.io)

A New Invention Aims to Make Computer Servers Worldwide More Climate Friendly
https://devstyler.io/blog/2021/07/07/a-new-invention-aims-to-make-computer-servers-worldwide-more-climate-friendly/ – Wed, 07 Jul 2021 10:34:59 +0000

An elegant new algorithm developed by Danish researchers can significantly reduce the resource consumption of the world's computer servers. Computer servers are as taxing on the climate as global air traffic combined, making the green transition in IT an urgent matter. The researchers, from the University of Copenhagen, expect major IT companies to deploy the algorithm immediately.

One downside of our runaway internet usage is its climate impact, owing to the massive amount of electricity consumed by computer servers. Current CO2 emissions from data centres are as high as those from global air traffic combined, and they are expected to double within just a few years.

Only a handful of years have passed since Professor Mikkel Thorup was among a group of researchers behind an algorithm that addressed part of this problem by producing a groundbreaking recipe to streamline computer server workflows. Their work saved energy and resources. Tech giants including Vimeo and Google enthusiastically implemented the algorithm in their systems, with online video platform Vimeo reporting that the algorithm had reduced their bandwidth usage by a factor of eight.

Now, Thorup and two fellow UCPH researchers have perfected the already clever algorithm, making it possible to address a fundamental problem in computer systems (some servers become overloaded while others have spare capacity) many times faster than is possible today. Professor Thorup of the University of Copenhagen's Department of Computer Science, who developed the algorithm alongside department colleagues Anders Aamand and Jakob Bæk Tejs Knudsen, commented:

“We have found an algorithm that removes one of the major causes of overloaded servers once and for all. Our initial algorithm was a huge improvement over the way the industry had been doing things, but this version is many times better and reduces resource usage to the greatest extent possible. Furthermore, it is free to use for all.”

The algorithm addresses the problem of servers becoming overloaded as they receive more requests from clients than they have the capacity to handle. This happens as users pile in to watch a certain Vimeo video or Netflix film. As a result, systems often need to shift clients around many times to achieve a balanced distribution among servers.

The mathematical calculation required to achieve this balancing act is extraordinarily difficult, as up to a billion servers can be involved in the system. And the system is in constant flux, with new clients and servers continually joining and leaving. This leads to congestion and server breakdowns, as well as resource consumption that drives up the overall climate impact. Thorup explains:

“As internet traffic soars explosively, the problem will continue to grow. Therefore, we need a scalable solution that doesn’t depend on the number of servers involved. Our algorithm provides exactly such a solution.”

According to the American IT firm Cisco, internet traffic is projected to triple between 2017 and 2022, with online video expected to make up 82% of all internet traffic by 2022. The new algorithm ensures that clients are distributed as evenly as possible among servers by moving them as little as possible and by retrieving content as locally as possible.

For example, to keep client distribution balanced so that no server is more than 10% more burdened than the others, the old algorithm might handle an update by moving a client one hundred times; the new algorithm reduces this to 10 moves, even when there are billions of clients and servers in the system. Stated mathematically: if the balance is to be kept within a factor of 1 + 1/X, the number of moves per update drops from X² to X, a bound that is generally impossible to improve upon.
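The guarantee described above matches the idea of consistent hashing with bounded loads: clients hash onto a ring of servers, and each server's load is capped at roughly (1 + ε) times the average, where ε plays the role of 1/X. The sketch below is illustrative only (class and method names are invented, and it is not the researchers' implementation):

```python
import hashlib
import math
from bisect import bisect_left

def _hash(key: str) -> int:
    # Stable 64-bit position on the ring for a key.
    return int.from_bytes(hashlib.sha256(key.encode()).digest()[:8], "big")

class BoundedLoadRing:
    """Consistent hashing with bounded loads (illustrative sketch).

    Every server's load is capped at ceil((1 + eps) * average load),
    so no server carries more than a factor 1 + eps above average.
    """

    def __init__(self, servers, eps=0.1):
        self.eps = eps
        self.ring = sorted((_hash(s), s) for s in servers)
        self.load = {s: 0 for s in servers}
        self.n_clients = 0

    def _capacity(self) -> int:
        avg = self.n_clients / len(self.load)
        return math.ceil((1 + self.eps) * avg)

    def assign(self, client: str) -> str:
        # Hash the client onto the ring, then walk clockwise to the
        # first server that still has spare capacity.
        self.n_clients += 1
        cap = self._capacity()
        keys = [h for h, _ in self.ring]
        i = bisect_left(keys, _hash(client)) % len(self.ring)
        for step in range(len(self.ring)):
            _, server = self.ring[(i + step) % len(self.ring)]
            if self.load[server] < cap:
                self.load[server] += 1
                return server
        raise RuntimeError("no server has spare capacity")
```

With eps = 0.1, assigning 30 clients to 3 servers leaves no server with more than ceil(1.1 × 10) = 11 clients, mirroring the "no more than 10% above average" balance in the text.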

As many large IT firms have already implemented Professor Thorup's original algorithm, he believes the industry will adopt the new one immediately, and that it may already be in use. Studies have demonstrated that global data centres consume more than 400 terawatt-hours of electricity annually. This accounts for approximately 2% of the world's total greenhouse gas emissions and currently equals all emissions from global air traffic. Data-centre electricity consumption is expected to double by 2025.

According to the Danish Council on Climate Change, a single large data centre consumes the equivalent of 4% of Denmark’s total electricity consumption.

Mikkel Thorup is head of the BARC research centre (Basic Algorithms Research Copenhagen) at the University of Copenhagen’s Department of Computer Science. BARC has positioned Copenhagen as the world’s fourth-best place in basic research in the design and analysis of algorithms. BARC is funded by the VILLUM FOUNDATION.

Artificial Intelligence Enhances Efficacy of Sleep Disorder Treatments
https://devstyler.io/blog/2021/06/09/artificial-intelligence-enhances-efficacy-of-sleep-disorder-treatments/ – Wed, 09 Jun 2021 12:01:02 +0000

Difficulty sleeping, sleep apnea and narcolepsy are among a range of sleep disorders that thousands of Danes suffer from. It is estimated that sleep apnea goes undiagnosed in as many as 200,000 Danes.

In a new study, researchers from the University of Copenhagen's Department of Computer Science collaborated with the Danish Center for Sleep Medicine at the Danish hospital Rigshospitalet to develop an artificial-intelligence algorithm that can improve diagnoses, treatments, and our overall understanding of sleep disorders. Mathias Perslev, a PhD student at the Department of Computer Science and lead author of the study, commented:

“The algorithm is extraordinarily precise. We completed various tests in which its performance rivaled that of the best doctors in the field, worldwide.”

Can support doctors in their treatments

Today's sleep disorder examinations typically begin with admission to a sleep clinic, where a person's overnight sleep is monitored using various measuring instruments. A specialist in sleep disorders then reviews the 7-8 hours of measurements from the patient's overnight sleep.

The doctor manually divides these 7-8 hours of sleep into 30-second intervals, each of which must be categorized into a sleep stage, such as REM (rapid eye movement) sleep, light sleep, or deep sleep. It is a time-consuming job that the algorithm can perform in seconds. Poul Jennum, professor of neurophysiology and Head of the Danish Center for Sleep Medicine, explains:
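The 30-second scoring convention described above is easy to illustrate: the overnight recording is cut into fixed-length epochs, and each epoch receives one stage label. A minimal sketch (the function name and the 100 Hz sample rate are illustrative assumptions, not details from the study):

```python
def epoch_signal(signal, sample_rate, epoch_sec=30):
    """Split a flat list of samples into fixed-length epochs.

    Each epoch is the unit a scorer (or a model) labels with a sleep
    stage. Trailing samples that do not fill a whole epoch are dropped.
    """
    n = sample_rate * epoch_sec  # samples per epoch
    return [signal[i:i + n] for i in range(0, len(signal) - n + 1, n)]

# An 8-hour recording sampled at 100 Hz yields 960 thirty-second epochs,
# each holding 3,000 samples.
eeg = [0.0] * (8 * 3600 * 100)
epochs = epoch_signal(eeg, sample_rate=100)
```

Scoring 960 such epochs by hand is what takes the specialist hours; the algorithm classifies them in seconds.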

“This project has allowed us to prove that these measurements can be very safely made using machine learning — which has great significance. By saving many hours of work, many more patients can be assessed and diagnosed effectively.”

In the Capital Region of Denmark alone, more than 4,000 polysomnography tests — known as PSG or sleep studies — are conducted annually on patients with sleep apnea and more complicated sleeping disorders. It takes 1.5-3 hours for a doctor to analyze a PSG study. Thus, in the Capital Region of Denmark alone, between 6,000 and 12,000 medical hours could be freed up by deploying the new algorithm.
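The hours quoted above follow directly from the figures in the text; a back-of-the-envelope check:

```python
# Figures from the article: ~4,000 PSG studies per year in the Capital
# Region of Denmark, each taking a doctor 1.5-3 hours to analyze.
n_studies = 4000
hours_per_study = (1.5, 3.0)

# Medical hours potentially freed up per year.
saved = tuple(n_studies * h for h in hours_per_study)
print(saved)  # (6000.0, 12000.0)
```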

The algorithm functions across sleep clinics and patient groups

By collecting data from a variety of sources, the researchers behind the algorithm have been able to ensure optimal functionality. In all, 20,000 nights of sleep from the United States and a host of European countries have been collected and used to train the algorithm. Mathias Perslev and Christian Igel, who led the project on the computer science side, added:

“We have collected sleep data from across continents, sleep clinics and patient groups. The fact that the algorithm works well under such diverse conditions is a breakthrough. Achieving this kind of generalization is one of the greatest challenges in medical data analysis.”

They hope that the algorithm will help doctors and researchers around the world learn more about sleep disorders in the future. The sleep-analysis software is freely available at sleep.ai.ku.dk and can be used by anyone, anywhere, including places where there isn't a sleep clinic around the corner. Mathias Perslev concluded:

“Just a few measurements taken by common clinical instruments are required for this algorithm. So, use of this software could be particularly relevant in developing countries where one may not have access to the latest equipment or an expert.”

The researchers are now working with Danish physicians to get the software and algorithm approved for clinical use.

New AI Tool Claims to Predict Who Will Die From Covid
https://devstyler.io/blog/2021/02/10/new-ai-tool-predicts-who-ll-die-from-covid-19-with-up-to-90-accuracy/ – Wed, 10 Feb 2021 12:32:48 +0000

Scientists from the University of Copenhagen have developed an AI tool that can predict who will die from COVID-19 with up to 90% accuracy, TNW reported.

The tool is said to determine, with up to 90% certainty, whether an uninfected person who later catches the virus will die from the disease. The researchers also report that it can predict, with 80% accuracy, whether someone admitted to hospital with COVID-19 will need a respirator.

It can also help identify whom to prioritize for vaccines and how many respirators a hospital will need. Professor Mads Nielsen from the University of Copenhagen added:

“We are working towards a goal that we should be able to predict the need for respirators five days ahead by giving the computer access to health data on all COVID positives in the region. The computer will never be able to replace a doctor’s assessment, but it can help doctors and hospitals see many COVID-19 infected patients at once and set ongoing priorities.”

The study found that, unsurprisingly, the most decisive indicators were BMI and age. Males and people with high blood pressure or neurological disease also have an elevated risk, as do those with chronic obstructive pulmonary disease (COPD), asthma, diabetes, or heart disease. Professor Nielsen added:
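The risk factors named in the study form a natural feature vector for a predictive model. The article does not describe the actual model or its weights, so the structure below is purely hypothetical, mirroring only the list of indicators:

```python
from dataclasses import dataclass

@dataclass
class PatientFeatures:
    """Hypothetical feature set mirroring the indicators the article
    names; the study's real model and encoding are not described."""
    bmi: float
    age: int
    male: bool
    hypertension: bool
    neurological_disease: bool
    copd: bool
    asthma: bool
    diabetes: bool
    heart_disease: bool

    def to_vector(self):
        # Numeric features first, then binary risk factors as 0/1.
        return [self.bmi, float(self.age)] + [
            float(flag) for flag in (
                self.male, self.hypertension, self.neurological_disease,
                self.copd, self.asthma, self.diabetes, self.heart_disease,
            )
        ]
```

A vector like this would then feed any standard classifier; which one the researchers used is not stated in the article.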

“For those affected by one or more of these parameters, we have found that it may make sense to move them up in the vaccine queue, to avoid any risk of them becoming infected and eventually ending up on a respirator.”

The full study is published in Scientific Reports.
