Monthly Archives: April 2016

Demonstration of virtualized 5G architecture

5G, the next-generation communication network that will support the Internet of Things by connecting billions of devices, will demand a far more complex infrastructure than existing networks and require a high level of ongoing optimisation and maintenance. Currently, operating expenses represent a major cost for network operators, who typically pay vendors to install bespoke equipment and subsequently carry out each software update and patch.

The virtualised 5G architecture is orchestrated in the cloud and based on off-the-shelf Intel-based server blades running a Linux OS. This means that the operator can rapidly deploy multiple Virtual Network Functions (VNFs) as Network Services, and no longer requires engineers to go out to the network’s physical sites to perform upgrades. It also enables operators to buy software from different vendors. Deploying a VNF on the Flat Distributed Cloud (FDC) takes around ten minutes, compared to tens of days for traditional deployment.

The virtualisation demonstration has been produced in association with the EC Horizon 2020 virtualisation project SoftFire, and operates using the FOKUS-developed ‘OpenBaton’ orchestrator and the established industry cloud platform ‘OpenStack’. The demonstration has been performed by researchers and testbed staff at the University of Surrey in collaboration with Cisco, Huawei and Quortus.

Developed and prototyped by the 5GIC over the past 18 months, the FDC utilises user and network context information in order to provide a more connected experience over a dynamic and distributed cloud based architecture, providing user benefits including better connection and faster throughput.

Professor Rahim Tafazolli, Head of the 5GIC, said, “This successful demonstration of the FDC is a huge step forward towards the development of a viable 5G network that supports mobile broadband, the Internet of Things and high-quality applications such as Ultra High Definition video and Virtual and Augmented Reality. The next step for the 5GIC team will be to demonstrate FDC-based network slicing — the partitioning of network resources for different purposes to create the perception of infinite capacity.”

Secure processing of patient data revealed

Collecting and analysing medical data on a large scale is an increasingly important research tool in understanding illnesses. To arrive quickly at new insights and avoid duplicating work, it is important that international researchers collaborate to use and enrich one another’s data. Such studies often involve sensitive patient information, so patients must be confident that their privacy will be safeguarded and their data securely stored in line with upcoming European privacy regulations, widely regarded as the strictest in the world.

To make this possible, Professor Bart Jacobs and Professor Eric Verheul, both computer scientists at Radboud University, have developed the Polymorphic Encryption and Pseudonymisation (PEP) technique. The PEP technique realises this goal by pseudonymising and encrypting data in such a way that the data cannot be accessed even by the party who stores the data. Moreover, access to the data is strictly regulated and monitored. The PEP technique makes it possible to analyse data from a study while ensuring that a patient’s privacy is safeguarded.

One of the first applications of the PEP technique is a study of Parkinson’s that was initiated by Radboud University. In this study, 650 people with Parkinson’s will be monitored for two years by means of, among other things, portable measuring equipment (wearables). Thanks to the PEP technique, the research data collected in the Netherlands can be shared in pseudonymised form with top researchers throughout the world.

Public investment in privacy

“In the context of international medical research, personal information is worth its weight in gold. So it’s important for the government to invest in an infrastructure that guarantees the protection of this information,” said Bart Jacobs, Professor of Digital Security at Radboud University. “Especially to ensure that people will remain willing to participate in future studies of this sort.” Radboud University and Radboud university medical center are investing €920,000 in the development of the PEP software. The Province of Gelderland is contributing €750,000. The software will be made available as open source so that other parties may also use it.

Bart Jacobs is optimistic about the future of the PEP system. “The study of Parkinson’s should demonstrate the usefulness of PEP. With this showcase as an example, PEP could grow to become the international standard for storing and exchanging privacy-sensitive medical data.” The first reactions from the field are positive, Jacobs concluded.

In short, Polymorphic Encryption and Pseudonymisation works as follows:

· the managers of the data cannot access the data

· participants in the study decide for each study if they want to allow their data to be used

· researchers who use the data are given a unique key

· the participants have a different pseudonym for each researcher. This prevents researchers from using another route to access data that they are not allowed to see.
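The full PEP scheme relies on polymorphic (re-randomisable) public-key encryption, but the per-researcher pseudonym property on its own can be sketched with keyed hashing. The sketch below is a simplified illustration, not the actual PEP construction; the researcher names and keys are hypothetical.

```python
import hmac
import hashlib

def pseudonym(patient_id: str, researcher_key: bytes) -> str:
    """Derive a researcher-specific pseudonym for a patient.

    Each researcher is issued a different key, so the same patient
    appears under a different identifier to each researcher, and two
    researchers cannot link their records to each other.
    """
    return hmac.new(researcher_key, patient_id.encode(), hashlib.sha256).hexdigest()

# Hypothetical keys issued by a trusted key authority:
alice_key = b"key-issued-to-researcher-alice"
bob_key = b"key-issued-to-researcher-bob"

# Stable for one researcher, unlinkable across researchers:
assert pseudonym("patient-042", alice_key) == pseudonym("patient-042", alice_key)
assert pseudonym("patient-042", alice_key) != pseudonym("patient-042", bob_key)
```

In the real system, pseudonymisation is combined with encryption so that, as noted above, even the party storing the data cannot read it.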

Plant ecological research

Long-term, broad-scale ecological data are critical to plant research, but often impossible to collect on foot. Traditional data-collection methods can be time consuming or dangerous, and can compromise habitats that are sensitive to human impact. Micro-unmanned aerial vehicles (UAVs), or drones, eliminate these data-collection pitfalls by flying over landscapes to gather unobtrusive aerial image data.

A new review in a recent issue of Applications in Plant Sciences explores when and how to use drones in plant research. “The potential of drone technology in research may only be limited by our ability to envision novel applications,” comments Mitch Cruzan, lead author of the review and professor in the Department of Biology at Portland State University. Drones can amass vegetation data over seasons or years for monitoring habitat restoration efforts, monitoring rare and threatened plant populations, surveying agriculture, and measuring carbon storage. “This technology,” says Cruzan, “has the potential for the acquisition of large amounts of information with minimal effort and disruption of natural habitats.”

For some research questions, drone surveys could be the holy grail of ecological data. Drone-captured images can map individual species in the landscape depending on the uniqueness of the spectral light values created from plant leaf or flower colors. Drones can also be paired with 3D technology to measure plant height and size. Scientists can use these images to study plant health, phenology, and reproduction, to track disease, and to survey human-mediated habitat disturbances.

Researchers can fly small drones along set transects over study areas of up to 40 hectares in size. An internal GPS system allows drones to hover over pinpointed locations and altitudes to collect repeatable, high-resolution images. Cruzan and colleagues warn researchers of “shadow gaps” when collecting data: taller vegetation can obscure shorter vegetation, hiding it from view in aerial photographs. Overlapping images are therefore required to get the right angles to capture a full view of the landscape.
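The overlap requirement can be turned into a simple flight-planning calculation. The sketch below is illustrative, not taken from the review: it assumes flat terrain and a nadir-pointing camera, and the altitude and field-of-view figures are hypothetical examples.

```python
import math

def ground_footprint(altitude_m: float, fov_deg: float) -> float:
    """Width of ground covered by a single nadir image, given flight
    altitude and the lens field of view (flat-terrain assumption)."""
    return 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)

def shot_spacing(altitude_m: float, fov_deg: float, overlap: float) -> float:
    """Distance between consecutive shots along a transect that yields
    the requested fractional overlap (e.g. 0.7 for 70% overlap)."""
    return ground_footprint(altitude_m, fov_deg) * (1 - overlap)

# Hypothetical survey: 50 m altitude with an 84-degree wide-angle lens.
w = ground_footprint(50, 84)   # roughly 90 m of ground per image
s = shot_spacing(50, 84, 0.7)  # a shot roughly every 27 m for 70% overlap
```

Raising the overlap fraction shortens the spacing, trading flight time and storage for fewer shadow gaps.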

The review lists additional drone and operator requirements and desired features, including video feeds, camera stabilization, wide-angle lenses for data collection over larger areas, and must-have metadata on the drone’s altitude, speed, and elevation of every captured image.

After data collection, georeferenced images are stitched together into a digital surface model (DSM) to be analyzed. GIS and programming software classify vegetation types, landscape features, and even individual species in the DSMs using manual or automated, machine-learning techniques.
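As a minimal example of the automated classification step, the sketch below labels vegetation pixels with the excess-green index, a common simple baseline for RGB imagery; it is not the specific method used in the review, and the sample pixel values are invented.

```python
import numpy as np

def classify_vegetation(rgb: np.ndarray, threshold: float = 0.05) -> np.ndarray:
    """Boolean vegetation mask from an H x W x 3 RGB image, using the
    excess-green index ExG = 2g - r - b on per-pixel channel fractions."""
    rgb = rgb.astype(float)
    fractions = rgb / (rgb.sum(axis=-1, keepdims=True) + 1e-9)
    r, g, b = np.moveaxis(fractions, -1, 0)
    return (2 * g - r - b) > threshold

# A 2x2 patch: one green (vegetation) pixel, three soil-coloured pixels.
patch = np.array([[[40, 160, 40], [120, 100, 80]],
                  [[130, 110, 90], [125, 105, 85]]], dtype=np.uint8)
mask = classify_vegetation(patch)  # only mask[0, 0] is True
```

Trained machine-learning classifiers, as the review notes, extend this idea from a single hand-picked threshold to many spectral and textural features.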

To test the effectiveness of drones, Cruzan and colleagues applied drone technology to a landscape genetics study of the Whetstone Savanna Preserve in southern Oregon, USA. “Our goal is to understand how landscape features affect pollen and seed dispersal for plant species associated with different dispersal vectors,” says Cruzan. They flew drones over vernal pools, which are threatened, seasonal wetlands. They analyzed the drone images to identify how landscape features mediate gene flow and plant dispersal in these patchy habitats. Mapping these habitats manually would have taken hundreds of hours and compromised these ecologically sensitive areas.

Before drones, the main option for aerial imaging data was light detection and ranging (LiDAR), which measures distances by timing laser pulses fired at the landscape from an aircraft. However, LiDAR is expensive, requires highly specialized equipment and flyovers, and is most frequently used to capture data from a single point in time. “LiDAR surveys are conducted at a much higher elevation, so they are not useful for the more subtle differences in vegetation elevation that higher-resolution, low-elevation drone surveys can provide,” explains Cruzan.

Position interventional needles

An ultrasound shows a shadow on the liver — but is it a tumor? Often, the only way to conclusively answer this question is to perform a biopsy, a procedure in which a doctor uses a long needle to remove a piece of the suspected tissue to be sent to a laboratory for testing. However, placing the biopsy needle with precision is far from easy. On one hand, the doctor needs to be sure of reaching the suspected tissue — and not healthy tissue just millimeters to the side. On the other hand, the needle must not damage veins, nerve pathways, and organs such as the lungs, and cannot penetrate bony structures such as ribs. To obtain an overview, doctors begin by performing a computed tomography scan, which they use to maneuver the needle to the correct position. The same challenges arise in treatments that use needles to direct heating, cooling, or high-energy beams into the cancerous tissue, thereby destroying the tumor.

Combining robot precision and doctors’ expertise

Soon, precisely positioning needles will become faster thanks to a robotic arm that researchers from the Fraunhofer Institute for Manufacturing Engineering and Automation IPA’s Project group for Automation in Medicine and Biotechnology PAMB and the Fraunhofer Institute for Medical Image Computing MEVIS have modified specifically for this purpose. “Whereas humans struggle to position this sort of needle, it’s hard to beat a robot designed for the purpose,” says Andreas Rothfuss, a researcher at the PAMB. “Our system removes burdens for doctors while leaving them in control.” In other words, the robot does what it does best — locating the right path and positioning the needle guide so that there is no risk of hitting or injuring either doctor or patient. Thereafter, the doctor again takes command and inserts the needle into the tissue. “A human needs 30 minutes to position the needle, but with robot assistance this is cut down to five minutes at most,” says Rothfuss.

The doctor begins the procedure by performing a computed tomography scan of the patient. This time, however, the robot arm accompanies the scan using a calibration tool to determine the ideal position to target a specific point in the image. Software from Fraunhofer MEVIS analyzes the image and supports the doctor in placing the virtual needle by displaying the needle in the image. If, instead of a biopsy, the doctor is administering treatment — seeking to destroy a tumor by applying heat, for instance — the software simulates how the heat will spread through the tissue. The last step is to determine the number of needles and their positions required to kill off the entire tumor. Thereafter, the robot arm’s calibration tool is replaced with a needle guide. The robot transports the guide to the calculated position and places it on the skin at the correct angle. However, it does not insert the needle itself: this is left to the doctor, who pushes the needle into the tissue step by step through the needle guide held in place by the robot.
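The idea behind simulating how heat spreads through tissue can be illustrated with a generic explicit finite-difference scheme for the heat equation. This one-dimensional sketch is for intuition only, not the Fraunhofer MEVIS software, and all parameters are hypothetical.

```python
import numpy as np

def simulate_heat(n=101, steps=2000, alpha=0.1):
    """Explicit finite-difference solution of the 1D heat equation:
    a needle tip held at ablation temperature in the middle of tissue
    that starts (and stays, at the far ends) at body temperature.
    Stable because alpha = k*dt/dx**2 <= 0.5."""
    temp = np.full(n, 37.0)                      # tissue at 37 °C
    for _ in range(steps):
        temp[1:-1] = temp[1:-1] + alpha * (temp[2:] - 2 * temp[1:-1] + temp[:-2])
        temp[n // 2] = 90.0                      # needle tip clamped at 90 °C
        temp[0] = temp[-1] = 37.0                # distant tissue stays at 37 °C
    return temp

temp = simulate_heat()
# temp falls off smoothly with distance from the needle tip, which is
# the kind of profile used to estimate how much tissue one needle reaches.
```

Repeating such a simulation for several candidate needle positions is one way the required number of needles could be planned.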

Less radiation exposure for doctor and patient

To ensure that the needle is in the planned position, doctors take X-rays as part of the standard procedure as they insert the needle into the tissue. Here, too, the robot offers several advantages. In conventional needle insertion, doctors hold the needle in place manually, obscuring a part of the X-ray. This also exposes doctors’ hands to radiation each time a monitoring image is taken. Now, the robot, impervious to radiation, can hold the needle in place with its needle guide. There is also a significant reduction in the patient’s radiation exposure — the doctor inserts the needle through the guide, eliminating needle slippage. As a result, the number of monitoring X-rays is greatly reduced.

The researchers will showcase their development at the MEDICA trade fair in Düsseldorf from November 14 to 17 (Hall 10, Booth G05). The robot arm will be positioning its needle guide over a transparent plastic box complete with artificial ribs and a tumor embedded in a transparent polymer. This will allow visitors to see exactly where the needle is. Researchers hope that the system could reach the market in around three years.

New way found to deal with ransomware

The answer to ransomware, University of Florida researchers say, lies not in keeping it out of a computer but rather in confronting it once it’s there and, counterintuitively, letting it lock up a few files before clamping down on it.

“Our system is more of an early-warning system. It doesn’t prevent the ransomware from starting … it prevents the ransomware from completing its task … so you lose only a couple of pictures or a couple of documents rather than everything that’s on your hard drive, and it relieves you of the burden of having to pay the ransom,” said Nolen Scaife, a UF doctoral student and founding member of UF’s Florida Institute for Cybersecurity Research.

Scaife is part of the team that has come up with the ransomware solution, which it calls CryptoDrop.

Ransomware attacks have become one of the most urgent problems in the digital world. The FBI issued a warning in May saying the number of attacks has doubled in the past year and is expected to grow even more rapidly this year.

It said it received more than 2,400 complaints from individuals and businesses last year, with estimated losses from such attacks of $24 million.

Attackers are typically shadowy figures from other countries lurking on the Dark Web and difficult, if not impossible, to find. Victims include not only individuals but also governments, industry, health care providers, educational institutions and financial entities.

Attacks most often show up in the form of an email that appears to be from someone familiar. The recipient clicks on a link in the email and unknowingly unleashes malware that encrypts his or her data. The next thing to appear is a message demanding the ransom, typically anywhere from a few hundred to a few thousand dollars.

“It’s an incredibly easy way to monetize a bad use of software,” said Patrick Traynor, an associate professor in UF’s department of computer and information science and engineering and also a member of the Florida Institute for Cybersecurity Research. He and Scaife worked together on developing CryptoDrop.

Some companies have simply resigned themselves to that inevitability and budgeted money to cover ransoms, which usually must be paid in Bitcoin, a digital currency that defies tracing.

Ransomware attacks are effective because, quite simply, they work.

Antivirus software is successful at stopping them when it recognizes ransomware malware, but therein lies the problem.

“These attacks are tailored and unique every time they get installed on someone’s system,” Scaife said. “Antivirus is really good at stopping things it’s seen before … That’s where our solution is better than traditional anti-viruses. If something that’s benign starts to behave maliciously, then what we can do is take action against that based on what we see is happening to your data. So we can stop, for example, all of your pictures from being encrypted.”

Scaife, Traynor and colleagues Kevin Butler at UF and Henry Carter at Villanova University lay out the solution in a paper accepted for publication at the IEEE International Conference on Distributed Computing Systems and scheduled to be presented June 29 in Nara, Japan.

The results, they said, were impressive.

“We ran our detector against several hundred ransomware samples that were live,” Scaife said, “and in those cases it detected 100 percent of those malware samples and it did so after only a median of 10 files were encrypted.”

America tweets again

Computer scientists from the University of Utah’s College of Engineering have developed what they call “sentiment analysis” software that can automatically determine how someone feels based on what they write or say. To test out the accuracy of this software’s machine-learning model, the team used it to analyze the individual sentiments of more than 1.6 million (and counting) geo-tagged tweets about the U.S. presidential election over the last five months. A database of these tweets is then examined to determine whether states and their counties are leaning toward the Republicans or Democrats.

“With sentiment analysis, it will try to predict the emotions behind every human being when he or she is talking or writing something,” says Debjyoti Paul, a doctoral student in the University of Utah’s School of Computing and the project leader along with School of Computing associate professor Feifei Li. “With that in mind, we are not just trying to look at the information in the tweets. We are trying to incorporate the emotion with the information.”

As a result of their work, the team has created an interactive website where users can find out if the tweets coming out of their state and its counties are more positive or negative toward Republicans or Democrats during any defined period of time since June 5. The data can also show the percentage of both positive and negative tweets toward a political party, and when there was a surge of a particular type of tweet in the last five months.

Some interesting facts about this year’s U.S. presidential election based on a sample of what people are tweeting:

• Based on the number of positive tweets posted since June toward each party, the computer model predicts that Hillary Clinton will win the presidential election.

• Republicans sent out 17 percent more political tweets than Democrats.

• Delaware was the only state in which a majority of tweets from all counties in the state were positive toward the same party — in this case, the Democrats.

• For the Republicans, South Dakota had the highest percentage of counties in which most of their tweets were positive toward the party (73 percent of the counties).

• The biggest surge of positive tweets for Republicans came during the Republican National Convention on July 18 and when the video of Donald Trump boasting about groping women was leaked Oct. 7 (presumably defenders of Trump tweeting their support of him).

• The largest surge of positive tweets for Democrats was after the last two presidential debates and after the New York Times published its story Oct. 1 that Donald Trump avoided paying federal taxes for nearly two decades.

• Not only did the number of positive tweets for Democrats peak after the last two debates and the Trump federal taxes story, it’s also when the most negative tweets about the Democratic Party were posted.

Analyzing the tweets

Paul and his team started with more than 250 million tweets posted around the world from June 5 to Oct. 30 and then weeded out all non-political tweets based on a system of keywords using advanced software. They were left with more than 1.6 million political tweets posted in the U.S.

Then those tweets were sifted through the team’s “sentiment analysis” software where each tweet was analyzed and assigned a score from 0 to 1 where 0 is the most negative sentiment, 1 is the most positive sentiment, and 0.5 is neutral. The scores are then collected in a database that can calculate a state or county’s political leanings in real time based on the tweets. The database is constantly updated with new tweets. To measure the accuracy of the model, the team compared its results to the New York Times Upshot election forecast website and found the state-by-state analysis was very similar.
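The aggregation step can be sketched as follows; the tweet data and the exact leaning calculation are hypothetical stand-ins for the team’s database queries.

```python
from statistics import mean

# Hypothetical scored tweets: (county, party, sentiment score in [0, 1],
# where 0 is most negative, 0.5 is neutral and 1 is most positive).
scored_tweets = [
    ("Salt Lake", "R", 0.8), ("Salt Lake", "R", 0.3),
    ("Salt Lake", "D", 0.9), ("Salt Lake", "D", 0.7),
]

def county_leaning(tweets, county):
    """Mean sentiment toward each party within one county; the county
    'leans' toward the party with the higher mean score."""
    means = {}
    for party in ("R", "D"):
        scores = [s for c, p, s in tweets if c == county and p == party]
        means[party] = mean(scores) if scores else 0.5  # no tweets: neutral
    return max(means, key=means.get), means

party, means = county_leaning(scored_tweets, "Salt Lake")
```

Because each new tweet only appends one scored row, this style of aggregate is cheap to keep current, which is what allows the leanings to be recomputed in real time.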

“I think it works really well. It matches up with the major events that happened during this election season. That’s a good indicator that the results are accurate,” says Li. “We’re hoping to develop some more scientific measurements to confirm this observation for an upcoming paper, but the early results are very positive.”

Paul believes that their sentiment analysis software could be used to more accurately reflect the feelings of crowd-sourced opinions on the Internet, for example reviews of products on Amazon or restaurant reviews on Yelp, in which the software can “drill down to the individual sentences of the text” to determine a person’s true feelings about something, he says.