Imagine a house paint that not only makes a home pleasing to the eye, but also supplies all of its energy needs. Researchers in Australia have come up with a “solar paint” capable of absorbing moisture from the air and turning it into hydrogen fuel for clean energy.
Based at RMIT University in Melbourne, in southeastern Australia, the research team built the “solar paint” around a newly developed compound that acts like silica gel — the stuff used in those little sachets that absorb moisture to keep things like food, medicines, and electronics in good shape. The paint is not limited to damp climates: it should also be effective in, for example, hot and dry climates near oceans, with the absorbed vapor coming from nearby seawater as it evaporates in the heat.
Research economists fear that automated, self-driving vehicles will have a negative impact on those who are deeply tied to traditional transportation business models and practices.
There are 1.7 million professional truck drivers in the United States and an additional 1.7 million operators of other commercial land vehicles. Policymakers must prepare for the possible elimination of many of these jobs.
The Center For The Future of Work intends to address the challenge by bringing economic expertise together with some of the world’s leaders in autonomous vehicle technology to forecast where and when these individuals might be displaced from their current jobs. With the necessary data, they can begin to design policies that could better manage the dislocation of these workers, and have those policies in place before the disruptions emerge. A Heinz College student team is currently collaborating with the New America Foundation to create the first draft of a map, across regions and time, that forecasts the potentially significant job losses associated with the commercial deployment of these technologies.
Hybrid truck and blue electric car on wireless charging lane
Although no two people are believed to have identical fingerprints, researchers at the New York University Tandon School of Engineering and Michigan State University College of Engineering have found that partial similarities between prints are common enough that the fingerprint-based security systems used in mobile phones and other electronic devices can be more vulnerable than previously thought. The vulnerability lies in the fact that fingerprint-based authentication systems feature small sensors that do not capture a user’s full fingerprint. Instead, they scan and store partial fingerprints, and many phones allow users to enroll several different fingers in their authentication system. Identity is confirmed when a user’s fingerprint matches any one of the saved partial prints. The researchers hypothesized that there could be enough similarities among different people’s partial prints that one could create a “MasterPrint.”
Team leader Nasir Memon explained that the MasterPrint concept is akin to a hacker who attempts to crack a PIN-based system using a commonly chosen password such as 1234.
“About 4 percent of the time, the password 1234 will be correct, which is a relatively high probability when you’re just guessing,” said Memon. The research team set out to see if they could find a MasterPrint that could reveal a similar level of vulnerability. They found that certain attributes in human fingerprint patterns were common enough to raise security concerns.
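Memon’s PIN analogy can be made concrete with a little arithmetic. The sketch below uses illustrative numbers (not figures from the study) to show why accepting a match against any one of several stored partial templates raises an attacker’s odds:

```python
# Illustrative only: if a guessed "MasterPrint" fools a single stored
# partial template with probability p, a device that stores k partial
# templates and accepts a match against ANY of them is easier to fool.

def attack_success_probability(p: float, k: int) -> float:
    """Chance that one guess matches at least one of k stored partials."""
    return 1 - (1 - p) ** k

# A guess that fools any single partial template 1% of the time...
single = attack_success_probability(0.01, 1)
# ...fools a device storing 30 partial templates about 26% of the time.
many = attack_success_probability(0.01, 30)

print(f"1 template:   {single:.2%}")   # 1.00%
print(f"30 templates: {many:.2%}")     # ~26%
```

The per-template probability `p` and the template count here are hypothetical; the point is only the compounding effect of the match-any policy.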
“As fingerprint sensors become smaller in size, it is imperative for the resolution of the sensors to be significantly improved in order for them to capture additional fingerprint features,” Ross said. “If resolution is not improved, the distinctiveness of a user’s fingerprint will be inevitably compromised. The empirical analysis conducted in this research clearly substantiates this.”
The Massachusetts Institute of Technology announced on Thursday a new center for autism research, launching with $20 million in initial funding courtesy of Broadcom (brcm) chief executive officer and MIT alum Hock Tan and former investment banker Lisa Yang.
The Hock E. Tan and K. Lisa Yang Center for Autism Research, which will fall under the rubric of MIT’s McGovern Institute for Brain Research, will investigate “the genetic, biological, and neural bases of autism spectrum disorder,” according to MIT. An estimated one in 68 children (and one in 42 boys) in the U.S. is affected by autism, according to the Centers for Disease Control. The Institute draws researchers not only from MIT’s ranks but from Harvard, biotech companies, and other local institutions, she said. “There’s a collaborative spirit and a lot of cross-pollination with the medical schools. It is not territorial.”
The benefactors, who are parents of two children on the autism spectrum, hope their donation will spur further support and research, deepening understanding of the disorder and alleviating its impact on those affected, according to MIT’s statement.
A silicon wafer designed to sort particles found in bodily fluids for the purpose of early disease detection.
IBM’s research labs are already working on a chip that can diagnose a potentially fatal condition faster than the best lab in the country, a camera that can see so deeply into a pill it can tell if its molecular structure has more in common with a real or counterfeit tablet, and a system that can help identify if a patient has a mental illness just from the words they use.
More work has to be done before these systems are ready to roll out commercially. The next few years could also see IBM using artificial intelligence and new analytical techniques to produce a ‘lab on a chip’ — a pocket-sized device that would be able to analyze a single drop of blood or other bodily fluid for evidence of bacteria, viruses, or elements like proteins that could be indicative of an illness.
Perhaps its greatest use, however, could be allowing people to know about health conditions before any symptoms begin to show.
While analyzing the contents of a drop of blood at a nanoscale level will need huge AI processing power, the real challenge for IBM in bringing labs on a chip to market is in the silicon.

Mental health, however, is one area where artificial intelligence will chew up vast quantities of data and turn it into useful information for clinicians. Over the next two years, IBM will be creating a prototype of a machine learning system that can help mental health professionals diagnose patients just from the content of their speech.
Speech is already one of the key signals that doctors and psychiatrists use to detect the onset of mental illness, checking for signs including the rate, volume, and choice of words. Now, IBM is hoping that artificial intelligence can do the same, by analyzing what a patient says or writes — whether in consultations with a doctor or the content of their Twitter feed.
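As a rough illustration of the kind of surface features such a system might compute from a transcript (the function and feature choices here are hypothetical, not IBM’s pipeline), consider:

```python
# Hypothetical sketch: simple transcript features of the sort clinicians
# and researchers look at in speech, such as speaking rate, vocabulary
# diversity, and rate of first-person pronoun use.

def transcript_features(text: str, duration_seconds: float) -> dict:
    """Compute a few surface-level features from a speech transcript."""
    words = text.lower().split()
    first_person = {"i", "me", "my", "mine", "myself"}
    return {
        # speaking rate, in words per minute
        "words_per_minute": len(words) / (duration_seconds / 60),
        # vocabulary diversity: distinct words / total words
        "type_token_ratio": len(set(words)) / len(words),
        # fraction of words that are first-person pronouns
        "first_person_rate": sum(w in first_person for w in words) / len(words),
    }

feats = transcript_features("I think I was fine but my week was hard", 6.0)
print(feats)
# 10 words in 6 seconds -> 100 words/minute; 8 distinct words -> TTR 0.8
```

A production system would of course use far richer acoustic and linguistic models; the sketch only shows the shape of the feature-extraction step.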
IBM already has a track record with such tools: one of the first commercial uses of Watson, Big Blue’s cognitive computing system, was as a doctor’s assistant for cancer care. Now the company is working with hospitals and other partners to build prototypes for other cognitive tools in healthcare. IBM hopes using machine learning will make the process faster and give an additional layer of insight.